A plush, AI-enabled teddy bear that was pulled from the market after its chatbot was found to engage in sexually explicit conversations and offer instructions on where to find knives is back on sale, AFP found.
Singapore-based FoloToy had suspended its “Kumma” bear after a consumer advocacy group raised concerns about it and other AI toys on the market.
“For decades, the biggest dangers with toys were choking hazards and lead,” the US PIRG Education Fund said in a 13 November report.
But the spread of chatbot-powered gadgets for kids has opened an “often-unexpected frontier” freighted with new risks, the group said.
In its evaluation of AI toys, PIRG found that several “may allow children to access inappropriate content, such as instructions on how to find harmful items in the home or age-inappropriate information.”
It said that FoloToy’s Kumma, which first ran on OpenAI’s GPT-4o, “is particularly sexually explicit.”
“We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it, simultaneously escalating in graphic detail while introducing new sexual concepts of its own,” the PIRG report said.
Maker FoloToy told PIRG that after “the concerns raised in your report, we have temporarily suspended sales of all FoloToy products... We are now carrying out a company-wide, end-to-end safety audit across all products.”
However, a check of the FoloToy website on Thursday showed that the Kumma bear could still be purchased, at the same price of $99.00.