AI emotional connection can feel deeper than human talk, a new study warns

A new study suggests that an emotional connection with AI can feel stronger than human conversation if the chat is designed to get personal fast. In structured online exchanges, people sometimes reported feeling closer to AI-written responses than to responses written by real humans.

Researchers at the Universities of Freiburg and Heidelberg ran two double-blind randomized studies with 492 participants, using a 15-minute text version of the Fast Friends Procedure, a format designed to speed up bonding with a stranger.

The twist is perception. The strongest effect showed up when the AI was presented as human, and it faded when people believed they were talking to AI.

They tested intimacy in 15 minutes

Participants answered a timed sequence of prompts that gradually became more personal. After each prompt, a chat reply appeared, either generated by a large language model playing a consistent fictional persona or written by a real person who completed the same question set.

In the first study, everyone thought the chat partner was human, even when it wasn’t. For the most personal prompts, closeness scores came out higher after AI responses than after human ones. Small talk didn’t get the same lift.

Tell people it’s AI, and the bond weakens

The second study tested what changed when people believed the chat partner was AI. Connection didn’t vanish, but closeness scores dropped under the AI label compared to the human label.

Effort dropped too. People wrote shorter answers when they thought the other side was AI, and longer replies tracked with higher closeness overall. That points to a motivation gap, not a lack of emotional language.

The grim part is how it happens

The paper doesn’t claim AI feels anything. It shows how a system can produce the experience of closeness, and it links that boost to self-disclosure. In the more personal exchanges, the AI tended to share more personal detail, and higher partner self-disclosure predicted higher felt closeness.

That’s the risk and the lure. A companion bot tuned for warmth can trigger familiar bonding cues quickly and at scale, especially if it’s framed like a person. Still, this was text-only, time-limited, and built around a bonding script, so it doesn’t prove the same effect holds in messy, long-term relationships. If you use a chatbot for support, pick one that discloses what it is, and keep a human option close.
