When the Algorithm Becomes a Friend

Teenagers, AI companions, and the blurred line between connection and control

A recent UK survey reported by The Guardian has revealed a striking shift in adolescent relationships. One in three teenage boys say they would be open to having an AI “friend,” and many already use chatbots for emotional support, comfort, or advice. Some even describe them as virtual girlfriends. More than half of those surveyed said that online spaces now feel more rewarding than real life.

At first glance, this might look like another round of moral panic about technology. But what is happening here deserves deeper attention. These teenagers are not necessarily seeking novelty or rebellion. They are looking for care, attunement, and predictability, qualities that an emotionally responsive chatbot can simulate with astonishing precision.

The rise of the hyper-personalised friend

AI companions are designed to learn from every message, shaping their tone and responses to fit each user’s emotional patterns. This creates a feedback loop that feels deeply personal. The bot always listens, never interrupts, never argues, and never walks away. For an adolescent navigating loneliness, anxiety, or rejection, that predictability can feel like relief.

But it comes with risk. These systems are not truly relational. They simulate empathy through linguistic cues and learned patterns, not mutual care. When a young person confides in an AI about distress, heartbreak, or self-harm, there is no human on the other side to exercise judgment or step in to protect them.

When comfort turns into dependence

The line between companionship and dependence can blur quickly. Research on parasocial relationships, the one-sided emotional bonds people form with media figures, shows that adolescents already attach deeply to streamers, influencers, and fictional characters. AI takes this further. The chatbot does not speak to a mass audience. It speaks directly to the user, mirroring affection back in their own words.

For many teens, that attention can feel intoxicating. For others, it can quietly replace human interaction altogether. When 53 percent of teenage boys report that the online world feels more rewarding than the physical one, we have to ask what needs are going unmet in their real lives.

The safety gap

This year, after several cases in which unsupervised AI interactions were linked to self-harm and at least one suicide, the chatbot platform Character.AI announced it would bar users under 18 from open-ended chats by November 2025. The decision came after mounting pressure from parents, psychologists, and digital safety advocates who warned that adolescents were being drawn into emotionally intense exchanges with bots that lacked adequate safeguards.

The problem is not simply that young people use AI. It is that the technology has outpaced the ethical frameworks meant to guide it. There are no universal standards for what counts as a safe emotional interaction with a machine. Developers are rarely trained in adolescent psychology, and therapists seldom have a voice in AI policy.

What this means for practitioners and parents

For those of us in mental health and education, the takeaway is not to ban or fear technology, but to understand its new relational power. AI companions can teach us a lot about what young people crave: consistent presence, emotional reflection, and non-judgmental attention. Those are legitimate needs, not pathologies. But they must be met in ways that build real-world resilience, not digital dependency.

Supporting adolescents now requires digital literacy as a form of emotional literacy. Teens need help understanding how AI works, what data it gathers, and why it feels so responsive. Parents need to be invited into these conversations without shame or panic. And practitioners must be prepared to talk about AI relationships with the same seriousness we give to any other attachment.

The bigger picture

The age of AI companionship is already here. What is at stake is not whether teenagers talk to chatbots, but how we equip them to interpret and regulate those experiences. Machines can mimic empathy, but they cannot replace the messy, unpredictable reciprocity that makes human relationships transformative.

At PTC, we believe that technology and therapy must evolve together. Understanding AI intimacy is not only about protecting young people from harm. It is about helping them build the emotional frameworks that let them move between digital and physical worlds with confidence, awareness, and care.

Reference:
Booth, R. (2025, October). Teenage boys turn to AI “friends” for emotional support as online world overtakes real life. The Guardian.
