
Scientists urge ban on AIs designed to predict crime, Minority Report-style

You can't 'predict' crime without being racially biased because 'criminality' itself is racially biased, experts warn.

Tibi Puiu
June 25, 2020 @ 1:58 pm

A controversial study employing automated facial recognition algorithms to predict whether a person will commit a crime is due to be published in an upcoming book. But over 1,700 experts, researchers, and academics in AI research have signed an open letter opposing such research, citing “grave concerns” over the study and urging Springer, the book’s publisher, to withdraw its offer.

Still from the movie Minority Report, starring Tom Cruise. Credit: DreamWorks.

The research, led by a team from Harrisburg University in the U.S., proposes technology that can predict whether someone will commit a crime, a scenario reminiscent of the science fiction story and movie Minority Report — only this time, it’s no fiction.

Would-be offenders can supposedly be identified solely by their face with “80% accuracy and with no racial bias” by exploiting huge police datasets of criminal records and biometrics. Layers of deep neural networks then make sense of this data to “produce tools for crime prevention, law enforcement, and military applications that are less impacted by implicit biases and emotional responses,” according to Nathaniel Ashby, a Harrisburg University professor and co-author of the study slated for publication in the upcoming book series “Springer Nature — Research Book Series: Transactions on Computational Science and Computational Intelligence.”

However, the research community at large begs to differ. Writing to the Springer editorial committee in a recent open letter, over a thousand experts argue that predictive policing software is anything but unbiased. They cite published research showing that facial recognition software is deeply flawed and often works poorly when identifying non-white faces.

“Machine learning programs are not neutral; research agendas and the data sets they work with often inherit dominant cultural beliefs about the world,” the authors wrote.

“The uncritical acceptance of default assumptions inevitably leads to discriminatory design in algorithmic systems, reproducing ideas which normalize social hierarchies and legitimize violence against marginalized groups.”

Studies show that people of color are more likely to be treated harshly than white people at every stage of the legal system. Any software built on existing criminal legal frameworks will inevitably inherit these distortions in the data. In other words, the machine will repeat the same prejudices when it comes to determining if a person has the “face of a criminal”, which echoes the 19th-century pseudoscience of physiognomy — the practice of assessing a person’s character or personality from their outer appearance.

“Let’s be clear: there is no way to develop a system that can predict or identify “criminality” that is not racially biased — because the category of “criminality” itself is racially biased,” the authors said.

Lastly, the problem goes beyond how the AI is trained and the bias in its data: the science itself is shaky at best. The very idea that criminality can be predicted at all is dubious.

Artificial intelligence can certainly be a force for good. Machine learning algorithms are radically transforming healthcare, for instance by allowing professionals to identify certain tumors with greater accuracy than seasoned oncologists. Investors like Tej Kohli and Andreessen Horowitz have bet billions on the next generation of AI-enabled robotics, such as robotic surgeons and bionic arms, to name a few.

But, as we see now, AI can also lead to nefarious outcomes, and it’s still an immature field. After all, such machines are no more ethical or unbiased than their human designers and the data they are fed.

Researchers around the world are rising against algorithmically predictive law enforcement. Also this week, a group of American mathematicians wrote an open letter in the Notices of the American Mathematical Society urging their peers not to work on such software.

The authors of this letter oppose any kind of predictive law-enforcement software. Rather than identifying would-be criminals solely by their faces, some of this software supposedly “predicts” crimes before they happen, signaling to law enforcement where to direct their resources.

“In light of the extrajudicial murders by police of George Floyd, Breonna Taylor, Tony McDade and numerous others before them, and the subsequent brutality of the police response to protests, we call on the mathematics community to boycott working with police departments,” the letter states.

“Given the structural racism and brutality in US policing, we do not believe that mathematicians should be collaborating with police departments in this manner,” the authors state. “It is simply too easy to create a ‘scientific’ veneer for racism. Please join us in committing to not collaborating with police. It is, at this moment, the very least we can do as a community.”
