Faulty facial recognition leads to false imprisonment

On March 26, 2022, a violent attack took place on a Maryland Transit Administration bus in suburban Baltimore. The attacker punched the female bus driver several times after an argument over COVID masking rules, then fled with her cellphone.
According to a recent New Yorker report, surveillance cameras captured images of the attacker. Transit Police used these images to create a Be On the Lookout (BOLO) bulletin distributed to law enforcement agencies.
An analyst with the Harford County State’s Attorney’s Office ran a surveillance image through facial recognition software. The algorithm returned a match: Alonzo Cornelius Sawyer, a Black man in his 50s from Abingdon, Maryland.
Sawyer was arrested a few days later while in court on an unrelated matter.
Police questioned him and showed him the BOLO images, which he insisted were not of him. They dismissed his denials after his probation officer, Arron Daugherty, identified Sawyer as the attacker upon viewing the images. Daugherty had met Sawyer only briefly twice before, and on both occasions Sawyer had been wearing a mask.
Sawyer’s wife, Carronne Jones-Sawyer, also categorically denied that the man in the images was her husband, citing differences in age, build, clothing and more. She offered potential alibis placing Sawyer away from the scene at the time of the assault. Detectives, however, conducted no further investigation to corroborate the facial recognition match.
AI racial bias
This case illustrates the risks of excessive dependence on AI tools without sufficient safeguards.
Racial bias leads facial recognition systems to misidentify people of color at much higher rates. In the police investigation, the algorithmic match trumped conflicting eyewitness accounts.
After Sawyer had spent a month in jail, the charges were dropped when Daugherty, having met Sawyer’s wife, admitted his doubts. The use of facial recognition was never disclosed to Sawyer, and neither he nor his wife was informed when another man was later arrested for the attack.
The story highlights concerns about inadequate facial recognition training, lack of corroboration, failure to disclose use of the technology, and confirmation bias that led police to reject contradictory evidence.
Critics argue that the use of facial recognition should be banned or strictly limited, given its potential, absent oversight, to enable abuse and entrench injustice. Sawyer believes that without his wife’s intervention he would have wrongly pleaded guilty, demonstrating how the practice can enable overzealous prosecution.
As rapid advances in AI spread, the public needs protection from unproven technologies. Sawyer’s experience highlights the urgent need for reform, transparency and accountability to prevent more wrongful arrests.
Featured image credit: Cottonbro Studio; Pexels