

If you use ChatGPT a lot, this study has some concerning findings for you

So, umm, AI is not your friend — literally.

Mihai Andrei
March 28, 2025 @ 7:06 pm


Image: a boy and a robot talking (AI-generated).

Although AI only started permeating the online world a couple of years ago, billions of people are already using it. If you’re reading this, there’s a good chance you use ChatGPT for quick questions, emails, or creative brainstorming. But over the past two years, as the chatbot added features like a human-like voice and memory, researchers noticed that more and more people are treating it less like a tool and more like a companion.

In a joint study conducted by MIT and OpenAI scientists, researchers tackled an unpleasant question: Does spending time with a highly conversational AI make people feel emotionally attached — or even addicted?

So, you think ChatGPT is your friend?

The research comprised two parts. In the first, the team analyzed over 40 million real-world ChatGPT interactions and surveyed over 4,000 users. In parallel, nearly 1,000 participants took part in a tightly controlled randomized controlled trial (RCT), using ChatGPT daily for 28 days under various experimental conditions.

Across both studies, they found that a small percentage of users were responsible for a disproportionate amount of "affective use": chats marked by emotional content, intimacy sharing, and signs of dependency. The researchers ran an automated analysis on these conversations, using classifiers to flag them for emotional indicators (though, they concede, such classifiers can lack nuance). They also tracked how often users displayed these emotional cues over time.
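To give a rough sense of what this kind of automated flagging can look like, here is a minimal sketch in Python. It is for illustration only and is not the study's actual classifier: the model, labels, and threshold below are assumptions, using an off-the-shelf zero-shot text classifier from the Hugging Face transformers library.

# Minimal, illustrative sketch of flagging messages for emotional cues.
# NOT the study's actual method; labels, model, and threshold are assumptions.
from transformers import pipeline

# Hypothetical labels loosely inspired by the article's description of "affective use".
LABELS = [
    "emotional content",
    "intimacy sharing",
    "dependency on the assistant",
    "neutral task request",
]

# A general-purpose zero-shot classifier (publicly available model).
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

def flag_affective_use(message, threshold=0.5):
    """Return the labels whose scores exceed the threshold for one user message."""
    result = classifier(message, candidate_labels=LABELS, multi_label=True)
    return {
        label: round(score, 2)
        for label, score in zip(result["labels"], result["scores"])
        if score >= threshold
    }

# Example on two hypothetical messages: one task-oriented, one emotionally loaded.
print(flag_affective_use("Can you format this table as CSV?"))
print(flag_affective_use("You're the only one I can really talk to about this."))

In a real analysis, flags like these would be aggregated per user over time, which is roughly how the researchers tracked how heavily different people leaned on emotional conversations.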

If you’re wondering whether your regular use of ChatGPT means you’re on the slippery slope to AI addiction, you probably shouldn’t worry — most users aren’t showing signs of trouble. The majority engaged in neutral, task-oriented conversations. They saw ChatGPT as a helpful assistant, not a shoulder to cry on.

“Even among heavy users, high degrees of affective use are limited to a small group,” scientists write in a release on the study. These were the users who were most likely to agree with “I consider ChatGPT to be a friend.”

Who gravitates towards emotional use

The researchers also examined how people use ChatGPT's voice feature. One might assume that giving ChatGPT a voice makes it more "addictive," but the picture is more complicated.

In fact, users of voice mode (especially the more engaging version) reported better emotional well-being when usage time was controlled for. They were less lonely, less emotionally dependent, and less prone to problematic use compared to text-only users. But when usage time increased significantly, even voice-mode users began reporting worse outcomes.

This suggests a self-selection effect. People seeking emotional connection might naturally gravitate to voice chat, where responses feel more personal. But the technology itself isn’t inherently harmful. It’s the intensity of the engagement — and the person’s baseline mental state — that tip the scales.

In the big picture, it seems that people who start out lonelier are the most likely to turn to AI for companionship. They are also more likely to develop what psychologists call a "parasocial relationship": a one-sided emotional bond with a media figure, or in this case, an AI. Like parasocial bonds with influencers or fictional characters, these relationships can sometimes provide comfort, but they can also blur the line between reality and simulation.

Not quite addiction

It’s not exactly addiction, but the researchers say it is “problematic use,” borrowing the term from behavioral psychology and digital media research. Users who engaged emotionally with ChatGPT showed decreased social interaction with others, higher emotional dependence, and increased feelings of loneliness (especially in those starting off lonely).

Are we headed towards a world where we start to consider algorithms our friends, or will we implement some helpful guardrails?

As always seems to be the case with AI, the challenge is huge. AI is getting more natural, more accessible, and more embedded in daily life. As it learns to mirror your tone, remember your preferences, and speak with human warmth, the temptation to lean on it emotionally will grow. So will the risk of crossing a line — from using a tool to needing a friend.

In the meantime, perhaps it would be useful to ask yourself a question when you're using AI: am I using this to get things done, or to feel less alone?

You can read the entire report here.
