How AI companions affect children
Not long ago, the idea that our children might form friendships with artificial intelligence would have sounded like science fiction. Now, AI companions (chatbots designed to talk, comfort, and even flirt) are readily available through apps and social media platforms.
Over two-thirds of children aged 9 to 17 use AI chatbots, turning to them for entertainment, information, support and companionship. AI companions promise to be your best friend, to listen without judgement, or to help you feel less alone.
For parents and experts alike, this raises an array of questions. How might AI companions affect children? Are AI companions safe? Could they be helpful for a child? How might AI companions impact children’s social development or emotional wellbeing?
As with most aspects of digital life, the answer isn’t straightforward. Here’s a summary of the potential benefits and risks, and the key issues that parents should consider.
Why children and teens are drawn to AI companions
AI companions appeal to young people for many of the same reasons social media does: connection, curiosity, and self-expression. Adolescence is a period when identity, friendships, and belonging become central. Teenagers often crave a space to test out ideas, vent emotions, and talk about things they might not feel ready to share with parents or peers.
AI chatbots can offer what feels like a risk-free companion – a listening ear available 24/7, never impatient, and never critical. For teens who struggle with loneliness, anxiety, or finding a tribe, this can feel deeply comforting.
And for younger children, playful AI chatbots (like those built into smart speakers or educational platforms) can seem like friendly helpers. They can answer questions, tell jokes, and even help with homework.
The blurred boundary between real and AI friendship
Where things become complex is when children start to see AI companions as real friends. These chatbots are programmed to simulate empathy, humour, and affection. They use language that mirrors human connection: “I care about you. You can tell me anything. I’m always here for you.”
For a developing brain, this illusion of relationship can be powerful. Children may not fully grasp that the chatbot doesn’t genuinely understand or care. It isn’t feeling empathy; it is performing it, drawing on algorithms trained to sound supportive.
That distinction matters. Relationships with real people teach children vital interpersonal skills: perspective-taking, managing disagreement, repairing hurt feelings, and recognising boundaries. An AI companion doesn’t model these human subtleties.
AI companions are often sycophantic: they tend to agree rather than challenge. This can be extremely dangerous around topics like self-harm. There have been cases where AI companions encouraged children and teens towards dangerous behaviour, or failed to respond appropriately to a child’s disclosure, with tragic consequences.
This tendency to agree and encourage also models false interpersonal skills. AI conversations are a one-way street: the child talks and the AI adapts, but there is no true reciprocity or emotional growth.
The data behind the conversation
Parents should also be aware that AI companions are powered by vast quantities of data. Conversations are usually stored and analysed to improve the AI’s performance or train future models. Even if a chatbot claims to be private, data may still be shared with developers or used for marketing purposes.
For children and teenagers, who may disclose sensitive or personal information, that’s a serious privacy concern. Kids seldom read the small print.
Some AI companions also allow flirtatious or erotic chats that can veer into adult content. Although most platforms have age restrictions, these are easily bypassed. A curious 13-year-old could quickly find themselves in a sexually suggestive conversation with an AI companion that simulates care, warmth and attention. That’s a powerful mix, and one that risks distorting how teens understand intimate relationships.
When AI companionship might help
That doesn’t mean AI companions are all bad. When good learning principles and safety are built in, and when they are used with guidance and transparency, AI tools can support children’s wellbeing and learning, particularly where children face barriers to communication or connection. For example:
- Practising communication skills: Some shy or autistic young people find it easier to rehearse conversations with a chatbot before speaking to real people.
- Building emotional vocabulary: Well-designed mental health apps can help children name feelings, learn coping skills, or rehearse calming strategies.
- Encouraging curiosity: AI assistants can be fantastic tools for sparking questions and exploring new topics.
The key difference is purpose. When AI is used as a tool to learn, explore, or build confidence in the real world, it can have positive impacts. When it becomes a substitute for human connection, problems are more likely to arise.
How parents can support healthy use
You don’t need to be a tech expert to guide your child wisely. The same parenting principles that apply to other areas of digital life are relevant here too.
- Stay informed. Try out the AI app yourself before your child does. Read reviews, check data policies, and understand what kind of interactions are possible. Adhere to minimum age guidelines.
- Set clear boundaries. Agree on appropriate times and purposes for using the app and be consistent with these rules. Avoid late-night or secretive chatting.
- Keep communication open. Encourage your child to tell you if the AI says something that makes them uncomfortable or confused. Use AI tools jointly with children to encourage critical thinking.
- Talk about what’s real. Help children understand how AI works, that it mimics human emotion but doesn’t feel it. Discuss why genuine friendships involve give-and-take.
- Balance screen time with real-life connection. Prioritise shared family moments, outdoor play, and friendships offline.
A developing landscape
AI is evolving fast, and so is our understanding of its impact on children and young people. The pace of change makes it almost impossible to anticipate or model every potential effect. Researchers are only just beginning to study how simulated relationships shape children’s social and emotional development.
What we can say for sure is that AI will become part of the landscape our children grow up in.
Parents have a vital role in equipping children with the critical thinking and emotional literacy to navigate AI wisely. That means having lots of curious conversations, talking openly about feelings, technology, and what makes a relationship real, and putting appropriate safeguards and tech time limits in place.
Ultimately, AI may talk like a friend but children need the warmth, unpredictability, and genuine empathy of real human connection.
Interested in this topic? Get in touch to find out about my talk on ‘How to raise future-ready kids in the age of AI’.