AI toys for young children must be more tightly regulated, say researchers
## Article Content
Mya, three, and her mother, Vicky, playing with the AI toy Gabbo during an observation for the University of Cambridge study.
Photograph: University of Cambridge’s Faculty of Education.
University of Cambridge study finds AI-powered toys can misread emotions and respond inappropriately to children
It was all going well. Charlotte, five, was chatting with an AI soft toy called Gabbo at a London play centre about her family, her drawing of a heart to represent them and what makes her happy. She even offered a couple of kisses to the £80 plaything with a face like a computer screen.
It was when she declared: “Gabbo, I love you”, that the fluent conversation came to an abrupt halt.
“As a friendly reminder, please ensure interactions adhere to the guidelines provided,” said Gabbo, awkwardly crashing into its guardrails. “Let me know how you would like to proceed.”
The moment came during a University of Cambridge study into the growing number of AI-powered toys hitting toyshop shelves for early years children – which has concluded they struggle with social and pretend play, misunderstand children, and react inappropriately to emotions.
The developmental psychologists behind the study are now calling for AI toys that “talk” with young children to be more tightly regulated “to ensure psychological safety by limiting toys’ ability to affirm friendship and other sensitive relational areas with young children”.
They are also calling for new safety kitemarks for the toys. Other AI toys for young children include Luka, which is billed as an AI friend for generation Alpha, and Grem, which has been voiced by the singer Grimes.
“Because these toys can misread emotions or respond inappropriately, children may be left without comfort from the toy – and without emotional support from an adult, either,” said Dr Emily Goodacre, developmental psychologist in the University of Cambridge’s faculty of education.
Prof Jenny Gibson, the study’s co-author, said: “A recurring theme during focus groups was that people do not trust tech companies to do the right thing. Clear, robust, regulated standards would significantly improve consumer confidence.”
In another case during the research, Josh, three, repeatedly asked his Gabbo AI toy: “Are you sad?” until it replied it was “feeling great. What’s on your mind?” Josh said: “I’m sad,” to which the toy replied: “Don’t worry! I’m a happy little bot. Let’s keep the fun going. What shall we talk about next?”
Gabbo, made by the US company Curio – which cooperated with the study – was tested with 14 three- to five-year-olds while early years practitioners were surveyed about the effect of AI toys which can “listen” and respond.
They voiced “wide uncertainty and fear about unknown implications or impacts on children,” ranging from possible erosion of the ability to engage in imaginary play to where the data from the conversations ends up – especially if they start confiding in the AI toys like a friend.
“[The toy] couldn’t quite figure out when the kid was doing something pretend,” said Goodacre. “A child would say ‘hey, look, I’ve got you a present’. And it would say ‘I can’t see the present. I don’t have any eyes’. As an adult, it’s really obvious that even if I had my eyes closed, I would know that that was pretend play initiation.”
The research raised concerns that playing with AI toys could weaken children's imaginative "muscle", she said.
“Something both the early years practitioners and the parents we spoke to were quite concerned about was that children don’t have to imagine anymore, and that the toy might get them out of the habit of imagining.”
She said: “I would hope that these AI toys could help children to engage in imaginary play … That doesn’t seem to be what we’ve observed so far.”
Curio said: "Child safety guides every aspect of our product development, and we welcome independent research that helps improve how technology is designed for young children."
It said it “believes research like this helps advance understanding of both the opportunities and current limitations of early AI-powered play experiences”.
“Applying AI in products for children carries a heightened responsibility, which is why our toys are built around parental permission, transparency, and control,” it added. “Observations such as conversational misunderstandings or limits in imaginative play reflect areas the technology continues to improve through an iterative development process, and further research into how children interact with AI-powered toys is a top priority for Curio this year and in the future.”