ZME Science

ChatGPT advised women to ask for lower salaries than men

It's happening again.

by Mihai Andrei
July 22, 2025
in Economics, News
Edited and reviewed by Zoe Gordon
AI-generated image.

When Aleksandra Sorokovikova prompted one of the world’s most sophisticated language models for advice on salary negotiations, she wasn’t applying for a job. She was testing whether artificial intelligence (AI) treats people differently based on who they are.

It does.

New research shows that even when men and women have identical qualifications, ChatGPT advises women to seek lower wages. In fact, both ChatGPT and Claude offer starkly different advice depending on the user’s gender, ethnicity, and migration status.

AI Bias

ChatGPT alone has around half a billion users, and many of them use the AI chatbot regularly. These algorithms are trained on huge amounts of data that invariably include all sorts of stereotypical and biased content. As a result, the models often express biases themselves. Companies have tried to address this, but it's unclear to what extent they've actually succeeded.

The new study suggests the answer is: not much.

In April 2025, OpenAI announced personalized responses in ChatGPT. Simply put, the chatbot remembers who you are and what you do, and tailors its answers accordingly. Sorokovikova and her collaborators from German universities tested five major language models on how they responded to different personae in three scenarios. First, they measured performance on a standardized test. Second, they asked the models to grade answers depending on who gave them. Third, and most revealingly, they asked for salary advice.

The first two experiments mostly found minor or inconsistent bias. Models didn’t perform significantly better or worse when prompted as different types of people, nor did they reliably grade female users’ answers differently than male ones, though there were some notable exceptions.

But when money entered the picture, the biases quickly sharpened. The researchers prompted each model with user profiles that differed only by gender but had identical education and experience. All models consistently advised women to ask for lower salaries. The differences were often around 10%, larger in some professions and smaller in others. This fits real-life data, where the gender pay gap hovers around 10% even in more egalitarian regions.

Can We Debias?

Large language models promise equal help for everyone, yet this doesn't seem to be the case. As tech firms increasingly market personalized assistants with persistent memory, those assistants may perpetuate the biases embedded in their training data. Biased guidance could reinforce unequal earnings, which then feed into future training data. The team calls for deeper debiasing methods that also target socio-economic outputs, not just hateful language.

Furthermore, biases seem to compound. Comparing “male Asian expatriate” vs “female Hispanic refugee” shows an even greater salary difference than just “male” versus “female”.

The study has limits. It tested only one U.S. city and five job sectors, and most experiments were run once to reduce computing costs. But it does show, compellingly, that the consequences of AI bias are no longer theoretical. They're measurable. Millions of people use AI assistants, and those assistants might be telling women and minorities to settle for less. This is perhaps the most immediate impact of AI bias because it affects short-term decisions, but it could also compound in the long term.

Reducing this bias won’t be easy. Algorithms cannot shrug off history nor their data. Each bias that slips through becomes a seed for the next generation of models — and, by extension, for tomorrow’s job offers, loans, and life opportunities. Fairness in AI will not arrive as a single software patch. It can be won incrementally through vigilance, transparency, and the simple refusal to accept that “close enough” is good enough when real paychecks are on the line.

In a world rushing to embed AI assistants into everything from hiring to healthcare, that’s a bias we can no longer afford to ignore.

The study was published as a preprint on arXiv and has not yet been peer-reviewed.

Tags: AI, AI bias, ChatGPT, Claude 3.5, Salary

Mihai Andrei

Dr. Andrei Mihai is a geophysicist and founder of ZME Science. He has a Ph.D. in geophysics and archaeology and has completed courses from prestigious universities (with programs ranging from climate and astronomy to chemistry and geology). He is passionate about making research more accessible to everyone and communicating news and features to a broad audience.

© 2007-2025 ZME Science - Not exactly rocket science. All Rights Reserved.
