Chatbot-powered toys rebuked for discussing sexual, dangerous topics with kids



OpenAI says it doesn’t allow its LLMs to be used this way

When reached for comment about the sexual conversations detailed in the report, an OpenAI spokesperson said:

Minors deserve strong protections, and we have strict policies that developers are required to uphold. We take enforcement action against developers when we determine that they have violated our policies, which prohibit any use of our services to exploit, endanger, or sexualize anyone under 18 years old. These rules apply to every developer using our API, and we run classifiers to help ensure our services are not used to harm minors.

Interestingly, OpenAI’s representative told us that OpenAI doesn’t have any direct relationship with Alilo and that it hasn’t seen API activity from Alilo’s domain. OpenAI is investigating the toy company and whether it is running traffic over OpenAI’s API, the rep said.

Alilo didn’t respond to Ars’ request for comment ahead of publication.

Companies that launch products using OpenAI technology and targeting children must obtain parental consent and adhere to the Children’s Online Privacy Protection Act (COPPA) where applicable, as well as any other relevant child protection, safety, and privacy laws, OpenAI’s rep said.

We’ve already seen how OpenAI handles toy companies that break its rules.

Last month, the PIRG released its Trouble in Toyland 2025 report (PDF), which detailed sex-related conversations that its testers were able to have with the Kumma teddy bear. A day later, OpenAI suspended FoloToy for violating its policies (terms of the suspension were not disclosed), and FoloToy temporarily stopped selling Kumma.

The toy is for sale again, and PIRG reported today that Kumma no longer teaches kids how to light matches or about kinks.

A marketing image for FoloToy’s Kumma smart teddy bear, which has a $100 MSRP. Credit: FoloToys
But even toy companies that try to follow chatbot rules could put kids at risk.

