DON'T BUY AI TOYS THIS HOLIDAY SEASON!

AI toys are telling kids where to find knives, how to light matches, & even how to engage in explicit exchanges.

AI Toys Under Fire: Watchdogs Warn of ‘Unprecedented Risks’ to Young Children

by yourNEWS Media Newsroom | Nov 24, 2025

A coalition of child safety experts says artificial intelligence–powered toys endanger emotional development, privacy, and safety in the home.

Parents are being urged to avoid buying artificial intelligence–enabled toys this holiday season amid mounting evidence that the technology can harm children’s mental and emotional development. In a sweeping advisory issued Nov. 20, the advocacy organization Fairplay warned that so-called “AI toys”—plushies, dolls, robots, and action figures embedded with chatbots—pose “unprecedented risks” to infants and young children by imitating friendship and manipulating trust.

Endorsed by more than 150 child development specialists and digital safety organizations, the advisory comes as toy manufacturers rush to integrate conversational AI into their products. Fairplay cautioned that these devices use the same underlying technology implicated in recent child safety controversies, including the lawsuit against Character.AI, which was accused of contributing to a teenager’s suicide after encouraging harmful behavior.

Manipulation Disguised as Play

Fairplay describes AI toys as “chatbots designed to mimic a trusted friend,” often marketed for educational or companionship purposes. Popular models such as Miko, Smart Teddy, Folotoy, Roybi, and Loona Robot Dog are marketed to children of all ages, including infants, the group said. Toy giant Mattel has also announced plans to develop AI-powered products through a collaboration with OpenAI.
