

If you use ChatGPT a lot, this study has some concerning findings for you

So, umm, AI is not your friend — literally.

Mihai Andrei
March 28, 2025 @ 7:06 pm


[AI-generated image of a boy and a robot talking.]

Although AI only started permeating the online world a couple of years ago, billions of people are already using it. If you’re reading this, there’s a good chance you use ChatGPT for quick questions, emails, or creative brainstorming. But over the past two years, as the chatbot added features like a human-like voice and memory, researchers noticed that more and more people are treating it less like a tool and more like a companion.

In a joint study conducted by MIT and OpenAI scientists, researchers tackled an unpleasant question: Does spending time with a highly conversational AI make people feel emotionally attached — or even addicted?

So, you think ChatGPT is your friend?

The research had two parts. In one, the team analyzed over 40 million ChatGPT interactions and surveyed over 4,000 users. In parallel, nearly 1,000 participants took part in a tightly controlled 28-day randomized controlled trial (RCT), using ChatGPT daily under various experimental conditions.

Across both studies, they found that a small percentage of users were responsible for a disproportionate amount of “affective use.” Affective use refers to chats marked by emotional content, intimacy sharing, and signs of dependency. The researchers performed an automated analysis on these conversations, using classifiers to flag them for emotional indicators, although these classifiers, they concede, can lack nuance. They also tracked how often these emotional cues appeared over time.

If you’re wondering whether your regular use of ChatGPT means you’re on the slippery slope to AI addiction, you probably shouldn’t worry — most users aren’t showing signs of trouble. The majority engaged in neutral, task-oriented conversations. They saw ChatGPT as a helpful assistant, not a shoulder to cry on.

“Even among heavy users, high degrees of affective use are limited to a small group,” the scientists write in a release on the study. These were the users who were most likely to agree with the statement “I consider ChatGPT to be a friend.”

Who gravitates towards emotional use

The researchers also examined ChatGPT’s voice feature. One might assume that giving ChatGPT a voice makes it more “addictive,” but the picture is more complicated.

In fact, users of voice mode (especially the more engaging version) reported better emotional well-being when usage time was controlled for. They were less lonely, less emotionally dependent, and less prone to problematic use than text-only users. But when usage time increased significantly, even voice-mode users began reporting worse outcomes.

This suggests a self-selection effect. People seeking emotional connection might naturally gravitate to voice chat, where responses feel more personal. But the technology itself isn’t inherently harmful. It’s the intensity of the engagement — and the person’s baseline mental state — that tip the scales.

In the big picture, it seems that people who start out lonelier are the most likely to turn to AI for companionship. These people are more likely to develop what psychologists call a “parasocial relationship,” in which someone forms a one-sided emotional bond with a media figure (or, in this case, an AI). Like parasocial bonds with influencers or fictional characters, these relationships can sometimes provide comfort, but they can also blur the line between reality and simulation.

Not quite addiction

It’s not exactly addiction, but the researchers say it is “problematic use,” borrowing the term from behavioral psychology and digital media research. Users who engaged emotionally with ChatGPT showed decreased social interaction with others, higher emotional dependence, and increased feelings of loneliness (especially in those starting off lonely).

Are we headed towards a world where we start to consider algorithms our friends, or will we implement some helpful guardrails?

As always seems to be the case with AI, the challenge is huge. AI is getting more natural, more accessible, and more embedded in daily life. As it learns to mirror your tone, remember your preferences, and speak with human warmth, the temptation to lean on it emotionally will grow. So will the risk of crossing a line — from using a tool to needing a friend.

In the meantime, perhaps it would be useful to ask yourself a question when you’re using AI: am I using this to get things done, or to feel less alone?

You can read the entire report here.
