AI emotional connection can feel deeper than human talk, a new study warns

A new study suggests an AI emotional connection can feel stronger than human conversation when the chat is built to get personal fast. In structured online exchanges, people sometimes reported feeling closer to AI-written responses than to responses written by real humans.

Researchers at the Universities of Freiburg and Heidelberg ran two double-blind randomized studies with 492 participants, using a 15-minute text version of the Fast Friends Procedure, a format designed to speed up bonding with a stranger.

The twist is perception. The strongest effect showed up when the AI was presented as human, and it faded when people believed they were talking to AI.

They tested intimacy in 15 minutes

Participants answered a timed sequence of prompts that gradually became more personal. After each prompt, a chat reply appeared, either generated by a large language model playing a consistent fictional persona or written by a real person who completed the same question set.

In the first study, everyone was told the chat partner was human, even when it wasn't. For the most personal prompts, closeness scores came out higher after AI responses than after human responses. Small talk didn't get the same lift.

Tell people it’s AI, and the bond weakens

The second study tested what changed when people believed the chat partner was AI. Connection didn’t vanish, but closeness scores dropped under the AI label compared to the human label.

Effort dropped too. People wrote shorter answers when they thought the other side was AI, and longer replies tracked with higher closeness overall. That points to a motivation gap, not a lack of emotional language.

The grim part is how it happens

The paper doesn’t claim AI feels anything. It shows how a system can produce the experience of closeness, and it links that boost to self-disclosure. In the more personal exchanges, the AI tended to share more personal detail, and higher partner self-disclosure predicted higher felt closeness.

That’s the risk and the lure. A companion bot tuned for warmth can trigger familiar bonding cues quickly and at scale, especially if it’s framed like a person. Still, this was text-only, time-limited, and built around a bonding script, so it doesn’t prove the same effect holds in messy, long-term relationships. If you use a chatbot for support, pick one that discloses what it is, and keep a human option close.
