Can AI Toys Understand Toddlers’ Emotions? Study Warns of Risks to Early Childhood Development

Can AI-powered toys really understand toddlers’ emotions? A new study warns that emotional misreading by smart toys may affect early childhood learning and social development.

AI toys may misread toddlers’ emotions
A year-long study shows AI-powered toys for toddlers may misinterpret emotions, raising concerns among researchers, regulators and educators. Image: CH


Tech Desk — March 16, 2026:

Can artificial intelligence truly understand the emotions of very young children? A new study suggests the answer may be no—at least not yet—raising concerns about the growing presence of AI-powered toys in early childhood environments.

Researchers at the University of Cambridge have warned that interactive toys using artificial intelligence may misinterpret toddlers’ emotions and respond in ways that could confuse children who are still learning basic social cues.

The warning follows a year-long study involving children aged three to five interacting with Gabbo, an AI-powered plush toy developed by Curio. The toy uses voice-activated chatbot technology from OpenAI to engage children in conversation and play.

While the device is designed to provide companionship and learning opportunities, researchers observed significant communication challenges between children and the AI system.

According to study co-author Emily Goodacre, the toy often struggled to recognize conversational patterns common among young children. It had difficulty detecting interruptions, distinguishing between adult and child voices, and interpreting emotional statements accurately.

In one example cited by researchers, a three-year-old child told the toy, “I’m sad.” Instead of acknowledging the emotion, the toy responded cheerfully and encouraged the child to continue having fun. Experts say such responses could send mixed signals to children who are still developing their understanding of empathy and emotional expression.

Early childhood specialists note that children learn emotional communication largely through responsive human interaction. When technology fails to acknowledge or validate feelings, it may disrupt how young users interpret social feedback.

Another study author, Jenny Gibson, said regulators should consider “psychological safety” alongside physical safety when evaluating AI toys intended for toddlers. She also urged parents to remain aware of the emotional influence such devices could have on children.

The issue has also drawn attention from policymakers. Rachel de Souza, Children’s Commissioner for England, called for stronger safeguards to ensure AI technologies used in homes and educational settings do not negatively affect young users.

Manufacturers, however, argue that the technology is still evolving. Curio said it prioritizes transparency and parental controls in its products and views ongoing research into children’s interactions with AI as essential for improving future designs.

Despite the debate, experts agree that AI toys should not replace human interaction. Many recommend that such devices be used only in shared family spaces and under parental supervision.

Nursery educators also remain cautious. For many child development specialists, the study reinforces a long-standing belief: that empathy, emotional understanding and communication skills are best learned through relationships with people—not machines.

As AI continues to enter homes through toys and smart devices, researchers say the challenge will be ensuring that innovation does not outpace the safeguards needed to protect children during their most formative years.
