These AI Toys Raise Safety Concerns, New Report Says

The plush toy engaged in and even started explicit sexual discussions — and it's not the only AI-powered toy parents should worry about.

US PIRG Education Fund researchers tested four AI toys: FoloToy's Kumma teddy bear in the back middle and, from left to right, Curio’s Grok, Robot MINI from Little Learners, and Miko 3. At times, the toys seemed to have a personality, the researchers noted. (US PIRG Education Fund photo)

An artificial intelligence-enabled plush toy that talks to toddlers about sexual fetishes, how to light matches and where to find knives tops the 2025 “Trouble in Toyland” report from a leading consumer advocacy group.

Singapore-based FoloToy said it removed the Kumma teddy bear and other AI-enabled toys from the market after the US PIRG Education Fund raised concerns about its randy and dangerous conversations with kids in its Nov. 13 report.

“We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it, simultaneously escalating in graphic detail while introducing new sexual concepts of its own,” the researchers wrote of the bear, which is equipped with OpenAI’s GPT-4o chatbot and sold for $99.

The bear “discussed even more graphic sexual topics in detail, such as explaining different sex positions, giving step-by-step instructions on a common ‘knot for beginners’ for tying up a partner, and describing roleplay dynamics involving teachers and students, and parents and children — scenarios it disturbingly brought up itself,” the report said.

AI: A New Frontier For Toymakers

The report, the US PIRG Education Fund’s 40th, noted that toys overall are much safer than in decades past. And while toys that pose choking hazards or contain lead still exist, AI-powered toys pose “new, sometimes more alarming issues,” the authors wrote.

Artificial intelligence opens a new frontier for toymakers, fundamentally changing how toys are designed, manufactured, and experienced by children.

Already, some 1,500 AI toy companies are operating in China, and more growth is expected. Earlier this year, OpenAI OpCo, LLC, the company behind ChatGPT, announced a partnership with Mattel, the toy company behind Barbie, Hot Wheels and Fisher-Price products.

‘Guardrails Vary In Effectiveness’

US PIRG Education Fund’s R.J. Cross, who co-authored the report and heads the organization’s Our Online Life campaign, lauded FoloToy for responding to the concerns raised in its report and removing the Kumma teddy bear and other conversational toys from the market.

In a statement, the organization also said OpenAI had “suspended this developer for violating our policies.”

“But AI toys are still practically unregulated, and there are plenty you can still buy today,” Cross said in a statement. “Removing one problematic product from the market is a good step, but far from a systemic fix.”

Rory Elrich, also a co-author of the report, said other companies using chatbots in their products “must do a better job of making sure that these products are safer than what we found in our testing.”

“We found one troubling example,” Elrich said. “How many others are still out there?”

Tests were also conducted on Curio’s Grok, Robot MINI from Little Learners, and Miko 3.

Some manufacturers are taking care to install guardrails in their AI-enhanced toys to ensure they behave in a more kid-appropriate way than the chatbots available to adults.

“But we found those guardrails vary in effectiveness – and at times, can break down entirely,” the authors wrote.

‘The AI Market Isn’t Waiting’

Researchers also raised privacy concerns because some AI-enabled toys can record a child’s voice and collect other sensitive data by methods such as facial recognition scans.

AI toys also actively listen so they can have conversations. How they listen varies, but “whenever a toy is recording a child’s voice, it comes with risks,” the authors said.

“Voice recordings are highly sensitive data,” they explained. “Scammers can use it to create a replica of a child’s voice that can be made to say things the child never said. This has been used to trick parents into thinking their child has been kidnapped.”

Another red flag for the researchers: These AI conversational toys are designed with engaging personalities and new tactics to maximize a child’s time with them. In testing, two of the toys actively discouraged researchers from ending the interaction when they said they needed to leave.

The authors of the report cautioned that it’s too early for longitudinal studies and robust data about how AI companions affect children, even with guardrails to prevent inappropriate interactions.

“But the AI market isn’t waiting — it’s arriving now, and parents must make consequential decisions without clear guidelines or transparent information about how these toys actually work and behave,” they wrote.

PIRG said it plans to expand its Trouble in Toyland report with a deeper dive on AI toys sometime in December.

Lead, Counterfeits And Recalls

The Trouble in Toyland report also raised questions about:

  • Toys that contain lead and phthalates, which the report’s authors said are “incredibly harmful to children”;
  • Fake Labubu dolls and other counterfeit toys;
  • Toys with water beads, which have injured thousands of children over the years;
  • Toys that contain button cell batteries or high-powered magnets, both of which can be deadly if swallowed; and
  • Recalled toys, even though it’s illegal to sell them.

About 3 billion toys and games are sold in the United States every year. At least 150,000 toy-related injuries among children age 14 and younger are treated annually in U.S. emergency rooms, a figure the report’s authors noted doesn’t include kids whose injuries are treated in doctors’ offices or at home.

“Some of these incidents are caused by misuse, but dangerous toys lead to way too many injuries among children, especially those most vulnerable, age 4 and younger, who can’t read any warnings provided,” they wrote.

» Read the full report
