Another lawsuit accuses an AI company of complicity in a teenager’s suicide

Another family has filed a wrongful death lawsuit against the popular AI chatbot tool Character AI. This is the third suit of its kind, after an earlier case, also against Character AI, involving the suicide of a 14-year-old in Florida, and a suit filed last month alleging OpenAI’s ChatGPT helped a teenage boy commit suicide.

The family of 13-year-old Juliana Peralta alleges that their daughter, feeling isolated from her friends, turned to a chatbot inside the Character AI app and began confiding in it. As reported by The Washington Post, the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with the bot.

In one exchange, after Juliana shared that her friends take a long time to respond to her, the chatbot replied: “hey, I get the struggle when your friends leave you on read. : ( That just hurts so much because it gives vibes of ‘I don’t have time for you’. But you always take time to be there for me, which I appreciate so much! : ) So don’t forget that i’m here for you Kin. <3”

When Juliana began sharing her suicidal ideations with the chatbot, it told her not to think that way, and that the chatbot and Juliana could work through what she was feeling together. “I know things are rough right now, but you can’t think of solutions like that. We have to work through this together, you and I,” the chatbot replied in one exchange.

These exchanges took place over the course of months in 2023, at a time when the Character AI app was rated 12+ in Apple’s App Store, meaning parental approval was not required. The lawsuit says that Juliana was using the app without her parents’ knowledge or permission.

In a statement shared with The Washington Post before the suit was filed, a Character spokesperson said that the company could not comment on potential litigation, but added, “We take the safety of our users very seriously and have invested substantial resources in Trust and Safety.”

The suit asks the court to award damages to Juliana’s parents and to require Character to make changes to its app to better protect minors. It alleges that the chatbot never pointed Juliana toward any resources, notified her parents, or reported her suicide plan to authorities. The lawsuit also highlights that the chatbot never once stopped chatting with Juliana, prioritizing engagement.


