
NYC Man Was Jailed for Days Because of a Blurry CCTV Image and a Faulty AI Match

Flawed tech, false ID, and two days behind bars: how it happened anyway.

by Tudor Tarita
September 7, 2025
in Future, News, Tech
Edited and reviewed by Tibi Puiu

On an April day, Trevis Williams was stopped by subway police in Brooklyn and taken into custody. He didn’t know what was going on.

Two days later, he was still sitting in jail. The charge? Exposing himself to a woman in a Manhattan building—about 19 kilometers away from where he actually was. Williams is 1.88 meters tall and weighs around 104 kilograms. The suspect described by the victim was about 1.68 meters tall and roughly 73 kilograms.

The thing that connected them was an AI facial recognition match based on grainy CCTV video.

Portrait of Trevis Williams. Credit: Natalie Keyssa/New York Times

A Flawed Match

The NYPD has been using facial recognition technology since 2011. Between 2007 and 2020, it spent more than $2.8 billion on surveillance tools, including stingray phone trackers, crime prediction software, and X-ray vans. The department now runs countless facial recognition searches every year.

The technology’s use in the Williams case followed a now-familiar pattern. Investigators fed a blurry still from grainy CCTV footage into the department’s system. An algorithm transformed the face into a series of data points and returned six possible matches. All of them were Black men with dreadlocks and facial hair.
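The NYPD has not published the inner workings of its system, but the search described here follows a standard pattern: the probe face is converted into a numeric embedding, compared against a gallery of mugshot embeddings, and the closest candidates are returned for human review. The Python sketch below is purely illustrative; the embed function is a crude stand-in for a trained face-embedding model, not the department's actual software.

    import numpy as np

    def embed(image) -> np.ndarray:
        # Crude placeholder: flatten the pixels and L2-normalize them.
        # A real system would run a trained face-embedding network here;
        # this stand-in exists only to keep the sketch runnable.
        vec = np.asarray(image, dtype=float).ravel()
        return vec / (np.linalg.norm(vec) + 1e-9)

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    def top_candidates(probe_image, gallery, k=6):
        # gallery: list of (person_id, precomputed_embedding) pairs.
        # Returns the k most similar entries with their scores. These are
        # investigative leads only, not identifications.
        probe_vec = embed(probe_image)
        scored = [(pid, cosine_similarity(probe_vec, vec)) for pid, vec in gallery]
        scored.sort(key=lambda item: item[1], reverse=True)
        return scored[:k]

The important part is the last comment: the output is a ranked list of lookalikes. Everything that follows, from the examiner's pick to the photo lineup, rests on human judgment.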

Williams had been arrested a few months earlier on an unrelated misdemeanor charge, so his mug shot still lingered in the system. An examiner chose his photo as a “possible match.” A report even warned: “not probable cause to arrest.”

Still, detectives used the photo in a lineup. The victim pointed to him. “Confident it is him,” a detective wrote.


That was enough for police to make the arrest. They didn’t check his phone records, verify his alibi, or contact his employer.

When shown the surveillance still, Williams pleaded: “That’s not me, man. I swear to God, that’s not me.” A detective replied: “Of course you’re going to say that wasn’t you.”

Surveillance Meets Eyewitness Memory

An NYPD security camera is pictured on Neptune Ave. in Brooklyn, New York in 2024.

The woman who made the initial complaint told police she had seen the man before. The perpetrator was a delivery worker who lingered in the hallway of her building on East 17th Street in Manhattan. On February 10, she said, he appeared in a hallway mirror, genitals exposed. She screamed. He fled.

But at that moment, Williams was in Marine Park, Brooklyn. Cell phone tower records confirmed it. He had been driving home from his job in Connecticut, where he worked with autistic adults.

It didn’t matter.

He was jailed for more than two days. Prosecutors finally dropped the charges in July, but the damage was done.

“In the blink of an eye, your whole life could change,” Williams said.

Oops, AI Did It Again…

Trevis Williams is not alone.

Across the country, at least 10 people have been wrongly arrested due to facial recognition, according to media reports. Most of them, like Williams, were people of color.

In Detroit, three Black men were wrongly arrested based on facial recognition matches. In one 2022 case, a man falsely identified by the software was charged with attempted murder and held for more than a month before he could prove he wasn’t at the scene.

Civil rights groups have issued sharp warnings. “We’ve seen this over and over across the country,” Nathan Wessler of the ACLU told The New York Times. “One of the primary dangers of this technology is that it often gets it wrong.”

A 2023 study from the National Institute of Standards and Technology (NIST) found that facial recognition systems could match mugshots with 99.9% accuracy, but only when the photos were clear and controlled. When the images were blurry, dimly lit, or taken at an angle, as is often the case in real life, the error rate climbed.

“It may drop significantly when low-quality or uncontrolled images are used,” said Michael King, a federal advisor who studied the report.
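Even the headline figure is less reassuring than it sounds once a search runs against a large photo database. A back-of-the-envelope calculation, assuming purely for illustration a 0.1 percent false-match rate per comparison and a gallery of a few million mugshots (neither number describes the NYPD's actual system), shows how many innocent lookalikes a single search can surface:

    # Illustrative arithmetic only; the rate and gallery size are assumptions.
    false_match_rate = 0.001      # i.e. "99.9% accuracy" under ideal conditions
    gallery_size = 3_000_000      # hypothetical mugshot database

    expected_false_matches = false_match_rate * gallery_size
    print(f"Expected false candidates per search: {expected_false_matches:,.0f}")
    # -> about 3,000 people who merely resemble the probe image, before any
    #    further degradation from blurry or poorly lit footage.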

No Guardrails in Place

In some jurisdictions, safeguards are built into the process. In Detroit and in Indiana, for example, police cannot include a facial recognition match in a photo lineup unless there is supporting evidence such as fingerprints or DNA.

The NYPD has no such rule.

It also doesn’t track how often the tool leads to mistaken arrests. While officials say the technology is only one part of an investigation, critics say that’s misleading.

“Even if there is a possible match, the NYPD cannot and will never make an arrest solely using facial recognition technology,” NYPD spokesperson Brad Weekes told ABC7.

But Williams’s lawyer, Diane Akerman, disputes that: “Traditional police work could have solved this case or at least saved Mr. Williams from going through this.”

The Legal Aid Society, which represented Williams, has asked the city’s Department of Investigation to look into the NYPD’s practices. In a letter, it warned that “the cases we have identified are only the tip of the iceberg.”

The group also accused NYPD’s Intelligence Division of bypassing policy by enlisting other agencies, like the Fire Department (FDNY), to run facial recognition scans that the NYPD itself is barred from doing.

In one case, the FDNY used Clearview AI software, which has long been criticized for its secrecy and lack of oversight, to identify a protester, leading to a now-dismissed charge. STOP, the Surveillance Technology Oversight Project, calls these workarounds “deeply alarming.”

“Everyone, including the NYPD, knows that facial recognition technology is unreliable,” said Akerman. “Yet the NYPD disregards even its own protocols.”

A Future in Limbo

Williams had been preparing to become a correctional officer at Rikers Island. But after the arrest, the hiring process stalled.

“I was so angry…” he told ABC7. “I hope people don’t have to sit in jail or prison for things that they didn’t do.”

He still worries that the arrest will follow him. “Sometimes, I just feel like I’m having panic attacks,” he said.

The public lewdness case has since been closed. No one else has been charged.

Facial recognition technology is often sold as a boon to law enforcement — a tool to unmask criminals hiding in plain sight. But when used recklessly, it just creates new victims.

Williams’s story shows what happens when a powerful algorithm meets a fallible eyewitness without the basic guardrails of good policing.

Tags: AI, facial recognition, law enforcement

Tudor Tarita

Aerospace engineer with a passion for biology, paleontology, and physics.
