AI emotional connection can feel deeper than human talk, a new study warns

A new study suggests an AI emotional connection can feel stronger than human conversation when the chat is built to get personal fast. In structured online exchanges, people sometimes reported feeling closer to AI-written responses than to responses written by real humans.

Researchers at the Universities of Freiburg and Heidelberg ran two double-blind randomized studies with 492 participants, using a 15-minute text version of the Fast Friends Procedure, a format designed to speed up bonding with a stranger.

The twist is perception. The strongest effect showed up when the AI was presented as human, and it faded when people believed they were talking to AI.

They tested intimacy in 15 minutes

Participants answered a timed sequence of prompts that gradually became more personal. After each prompt, a chat reply appeared, either generated by a large language model playing a consistent fictional persona or written by a real person who completed the same question set.

In the first study, everyone thought the chat partner was human, even when it wasn’t. In the most personal prompts, closeness scores came out higher after AI responses than after human responses. Small talk didn’t get the same lift.

Tell people it’s AI, and the bond weakens

The second study tested what changed when people believed the chat partner was AI. Connection didn’t vanish, but closeness scores dropped under the AI label compared to the human label.

Effort dropped too. People wrote shorter answers when they thought the other side was AI, and longer replies tracked with higher closeness overall. That points to a motivation gap, not a lack of emotional language.

The grim part is how it happens

The paper doesn’t claim AI feels anything. It shows how a system can produce the experience of closeness, and it links that boost to self-disclosure. In the more personal exchanges, the AI tended to share more personal detail, and higher partner self-disclosure predicted higher felt closeness.

That’s the risk and the lure. A companion bot tuned for warmth can trigger familiar bonding cues quickly and at scale, especially if it’s framed like a person. Still, this was text-only, time-limited, and built around a bonding script, so it doesn’t prove the same effect holds in messy, long-term relationships. If you use a chatbot for support, pick one that discloses what it is, and keep a human option close.
