

Machine learning is bringing back an infamous pseudoscience used to fuel racism

The pseudoscientific practice of physiognomy, dismissed as junk science centuries ago, is seeing a high-tech revival.

Mihai Andrei
January 24, 2025 @ 7:50 pm


Physiognomy lacks the scientific basis its proponents claim. Machine learning won’t change that. Image created by AI / Midjourney.

In the 18th century, Swiss pastor and poet Johann Kaspar Lavater revived an ancient “science.” Physiognomy, the idea of judging someone’s personality from their face, was not a new idea. It had been discussed by some ancient philosophers but fell into disrepute because, well, it didn’t work. Still, Lavater brought it back.

He wrote several popular essays that drew mixed reactions but ultimately popularized physiognomy, especially in criminology. Even though physiognomy didn’t work, it found its proponents: it was useful as a tool for segregation and for pushing the idea that some races are better than others. Lavater’s bias toward European features led to racial stereotyping. He wrote, for instance, that Jewish features were a sign of “neither generosity, nor tenderness, nor elevation of mind.”

European colonialists and race theorists used this to argue for the supposed superiority of white Europeans, claiming that the features of other races and ethnicities reflected inferiority and criminality. Before being firmly consigned to the bin of junk science, physiognomy helped shape eugenics and fascist ideologies.

Now, machine learning is bringing this back.

Your face is not who you are

Proponents of neo-physiognomy argue that deep neural networks (DNNs) uncover correlations that human judgment cannot. Purportedly, they can achieve unprecedented accuracy in identifying latent traits. However, the allure of high-performance metrics masks a deeper issue: the lack of scientific legitimacy.

“We hold up the renewed emergence of physiognomic methods, facilitated by ML, as a case study in the harmful repercussions of ML-laundered junk science,” write the authors of a new study.

“Research in the physiognomic tradition goes back centuries, and while the methods largely fell out of favor with the downfall of the Third Reich, the prospects of ML have renewed scientific interest in the subject,” they add.

Machine learning systems, particularly those involving DNNs, are exceptionally adept at detecting patterns in data. Yet, their results are only as valid as the data and assumptions they are built upon.

Image from an 1889 book on physiognomy attempting to infer people’s personality based on their nose geometry.

These models often fail to address confounding variables, overfit to biased datasets, and lack causal mechanisms for their claims. For example, studies claiming to infer criminality or sexual orientation from facial features base their conclusions on datasets that may reflect societal biases rather than inherent traits. A notable example includes training data scraped from social media, laden with cultural and contextual biases.
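To make the confounding problem concrete, here is a minimal, hypothetical sketch (not taken from any of the studies discussed in this article), using NumPy and scikit-learn. It shows how a classifier trained on a confounded dataset can look impressively accurate while learning nothing about the trait it claims to predict, only about how the photos happened to be taken:

```python
# Hypothetical illustration: a classifier exploits a confound (head tilt),
# not any genuine facial signal, to "predict" a trait that is pure noise.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# The "latent trait" we claim to predict. By construction it is random,
# so no facial feature can truly encode it.
trait = rng.integers(0, 2, n)

# The confound: in this made-up training set, one group happened to be
# photographed at a slightly different average head tilt (in degrees).
head_tilt = rng.normal(loc=90 + 5 * trait, scale=3, size=n)

# "Facial features" that really only reflect pose plus noise.
features = np.column_stack([head_tilt, rng.normal(size=n)])

model = LogisticRegression().fit(features[:4000], trait[:4000])
print("accuracy on confounded test data:", model.score(features[4000:], trait[4000:]))

# Evaluate on data where the confound is balanced: the apparent signal vanishes.
balanced_tilt = rng.normal(loc=90, scale=3, size=1000)
balanced_features = np.column_stack([balanced_tilt, rng.normal(size=1000)])
balanced_trait = rng.integers(0, 2, 1000)
print("accuracy once the confound is removed:", model.score(balanced_features, balanced_trait))
```

On the confounded test split, the toy model scores far above chance; once the head-tilt difference is removed, accuracy collapses to roughly 50 percent. The high benchmark number said nothing about faces and everything about how the data were collected.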

Resurgence of a pseudoscience

“A recent surge of deep learning-based studies have claimed the ability to predict unobservable latent character traits, including homosexuality, political ideology, and criminality, from photographs of human faces or other records of outward appearance,” write the authors, before giving a few recent examples. “In response, government and industry actors have adapted such methods into technologies deployed on the public in the form of products such as Faception, Hirevue, and Turnitin.”

Take, for instance, one recent study claiming that algorithms can detect whether someone is homosexual simply by looking at their face. The reasoning is circular: the “gender-prototypical” facial morphology it relies on is defined by the very same measures used in the sexuality classification task itself, the authors of the new research explain.

The problem of confounding factors is also well illustrated by this example. The authors of the homosexuality physiognomy study told participants (college students) to hold their chin at a precise 90-degree angle. “It might be ventured that the average 19-year-old falls short of a perfectly calibrated proprioceptive sensibility of a 90 [degree] chin-to-body angle,” quip Mel Andrews, Andrew Smart, and Abeba Birhane, the authors of the new study.

Moreover, these technologies pose existential risks to marginalized communities. Studies claiming to identify sexual orientation or gender from facial features could be weaponized in regions where LGBTQ+ identities are criminalized. Despite claims by researchers that these tools are meant as warnings, their very existence enables opportunistic actors to exploit them.

AI-powered physiognomy can be very dangerous

Another example of how physiognomists attempted to infer personality traits from facial features.

The pseudoscientific framework of physiognomy laid a foundation for systemic racism, influencing fields like anthropology, criminology, and eugenics. These biases were embedded into societal structures and used to justify slavery, segregation, and colonialism. While physiognomy has been debunked as junk science, its legacy persists — and this is apparent in the new wave of AI physiognomy studies.

We already know that AI datasets are often biased, and that this can produce misleading results. Coupling AI with physiognomy risks perpetuating and amplifying those biases by using flawed, pseudoscientific principles to make judgments about individuals based on their appearance.

AI systems trained on biased datasets can inherit societal stereotypes, leading to discriminatory outcomes in areas like hiring, policing, and social services. For instance, it’s not hard to see how an algorithm claiming to detect whether someone is gay could be weaponized.

This issue is compounded by the culture of ML itself. A focus on rapid innovation, minimal peer review, and the commodification of research outputs creates an environment where flawed methodologies can thrive. Whereas most sciences emphasize demonstrable domain expertise and rigorous peer evaluation, ML research often bypasses these safeguards.

The reanimation of physiognomy serves as a cautionary tale of what happens when technology is wielded without accountability. It’s a reminder that progress in science is measured not just by what we can achieve, but by the wisdom with which we achieve it.

This is far from just a hypothetical problem: its effects are already being felt.

“Authoritarian governments already actively use such technologies to suppress dissent and repress human rights,” the authors conclude.

