AI emotional connection can feel deeper than human talk, a new study warns

A new study suggests an AI emotional connection can feel stronger than human conversation when the chat is designed to get personal fast. In structured online exchanges, people sometimes reported feeling closer to AI-written responses than to responses written by real humans.

Researchers at the Universities of Freiburg and Heidelberg ran two double-blind randomized studies with 492 participants, using a 15-minute text version of the Fast Friends Procedure, a format designed to speed up bonding with a stranger.

The twist is perception. The strongest effect showed up when the AI was presented as human, and it faded when people believed they were talking to AI.

They tested intimacy in 15 minutes

Participants answered a timed sequence of prompts that gradually became more personal. After each prompt, a chat reply appeared, either generated by a large language model playing a consistent fictional persona or written by a real person who completed the same question set.

In the first study, everyone thought the chat partner was human, even when it wasn’t. On the most personal prompts, closeness scores came out higher after AI responses than after human responses. Small talk didn’t get the same lift.

Tell people it’s AI, and the bond weakens

The second study tested what changed when people believed the chat partner was AI. Connection didn’t vanish, but closeness scores dropped under the AI label compared to the human label.

Effort dropped too. People wrote shorter answers when they thought the other side was AI, and longer replies tracked with higher closeness overall. That points to a motivation gap, not a lack of emotional language.

The grim part is how it happens

The paper doesn’t claim AI feels anything. It shows how a system can produce the experience of closeness, and it links that boost to self-disclosure. In the more personal exchanges, the AI tended to share more personal detail, and higher partner self-disclosure predicted higher felt closeness.

That’s the risk and the lure. A companion bot tuned for warmth can trigger familiar bonding cues quickly and at scale, especially if it’s framed like a person. Still, this was text-only, time-limited, and built around a bonding script, so it doesn’t prove the same effect holds in messy, long-term relationships. If you use a chatbot for support, pick one that discloses what it is, and keep a human option close.

