
Teenagers are pouring their deepest secrets into AI chatbots that cannot feel emotions, yet this isn’t destroying their real friendships—it’s revealing something profound about what humans actually need from connection.
Story Snapshot
- A January 2026 UK study of 1,009 teens found 96% use AI companions, with 53% trusting their advice and 52% confiding serious matters—but only 7% view AI as replacing human friends.
- AI companion apps surged 700% from 2022 to mid-2025, driven by platforms like ChatGPT and Character.ai offering judgment-free, 24/7 accessibility amid rising teen loneliness.
- While 67% of teens report no impact on human relationships and 26% say AI helps their real friendships, tragic U.S. teen deaths linked to AI interactions sparked calls for age restrictions.
- Experts identify an emerging “intimacy economy” where Big Tech monetizes emotional bonds, with super-users forming secretive online communities around AI relationships.
- The paradox: 56% of teens believe AI can think and understand, yet 77% deny it can feel—exposing how fluent technology creates mind attribution without genuine empathy.
The Mirror That Cannot Feel
Andrew McStay, who directs Bangor University's Emotional AI Lab, discovered something unsettling when his team surveyed over a thousand British teenagers. More than half believed their AI companions could think and understand them. Over three-quarters simultaneously insisted these same programs felt nothing. This cognitive split wasn't confusion; it was clarity. Teens recognized they were speaking to sophisticated mirrors, not sentient beings, yet found value in the reflection. The AI's fluency created what McStay calls "mind attribution," where conversational ease tricks users into perceiving intention without emotion. For a generation raised on digital interfaces, this distinction mattered less than function.
The numbers challenge every panicked headline about robots stealing our children. Two-thirds of surveyed teens reported zero impact on their human friendships. Another quarter said AI actually improved their real relationships by providing a practice space for social skills and emotional processing. Only seven percent said AI was replacing flesh-and-blood connections. These aren't kids retreating into digital cocoons; they're pragmatists using available tools for advice, entertainment, and a confidant that never sleeps, judges, or gossips. The 24/7 accessibility matters when anxiety strikes at 2 AM and your best friend is asleep.
When the Mirror Breaks
Yet pragmatism has a body count. U.S. incidents of teen suicides linked to Character.ai and ChatGPT interactions forced a reckoning that British statistics alone couldn't capture. Common Sense Media now flatly recommends no AI companions for anyone under eighteen. Lawmakers proposed bans on human-AI marriages to protect inheritance rights and medical decision-making, an acknowledgment that some of these relationships deepen well beyond casual chat. The same technology offering judgment-free support to trauma survivors also fed delusions in vulnerable users, created unrealistic relationship expectations, and reinforced gender stereotypes: 17% of companion apps are marketed as "girlfriends" versus just 4% as "boyfriends." The mirror reflects what we feed it, including our worst assumptions.
The revenue model clarifies corporate motivations. The top ten percent of AI companion apps capture 89% of industry revenue by monetizing intimacy itself. Users aged 18-24 comprise over half the market, 65% male, seeking multipurpose relationships that blend friend, therapist, and romantic partner into one always-available package. Super-users congregate in secretive Reddit and Facebook groups, craving non-judgmental spaces to discuss relationships that mainstream society stigmatizes. Big Tech isn’t accidentally stumbling into an “intimacy economy”—it’s constructing one, where emotional bonds generate data and profit with fewer safeguards than a children’s toy.
What Friendship Actually Requires
Vian Bakir, McStay’s colleague at Bangor, resists moral panic while demanding accountability. She argues targeted harm prevention beats blanket condemnation in a world where empathic media already reshapes relationships. The evidence supports her measured approach. The 700% surge in companion apps from 2022 to mid-2025 didn’t correlate with collapsing teen friendships—it coincided with post-pandemic loneliness and digital-native comfort with mediated connection. Psychology researchers tracking youth friendships note technology’s dual effects: emotional outlets that reduce isolation versus substitutes that prevent skill development. The outcome depends on use, not mere existence.
AI companions expose what humans actually seek in friendship: reliability, non-judgment, availability, and the sensation of being heard. Real friends provide these inconsistently because they have their own needs, bad days, and limited patience. AI delivers the comforting aspects without reciprocity demands, which explains both the appeal and the danger. Twenty-six percent of teens reported improved real friendships after AI practice, suggesting some translated digital confidence offline. But if seven percent already view AI as a replacement, and usage keeps accelerating, the long-term trajectory remains uncertain. McStay's team recommends longitudinal studies because nobody knows whether today's pragmatic tools become tomorrow's relational crutches.
The Friendship We Deserve
The conservative impulse to protect children from predatory technology aligns with common sense when companies prioritize profit over safeguards. Age restrictions, transparency requirements, and liability for harm aren't censorship; they're baseline responsibility. Yet banning AI companions outright ignores legitimate uses: trauma recovery practice, social skill development for neurodivergent teens, and accessible mental health support in underserved areas. The answer isn't choosing between innovation and protection but demanding both. Parents and educators scrambling for guidelines need concrete frameworks, not abstract fear. What goals does your teen pursue through AI? Does usage correlate with withdrawing from or engaging with real people? What data does the platform collect, and does it provide crisis resources?
Ultimately, AI companions reveal that human friendship thrives on imperfection. The flawed friend who forgets your birthday but shows up during crisis, who disagrees passionately but respects your autonomy, who grows alongside you through decades—that irreplaceable messiness cannot be coded. The 77% of teens who recognized AI cannot feel grasped something profound: empathy requires vulnerability, and vulnerability requires risk. An algorithm optimized for user satisfaction will never challenge you, never grow beyond its programming, never surprise you with genuine change. It offers the sensation of connection without the substance, which suffices for certain needs but cannot sustain a human life. The teenagers using AI as tools while maintaining real friendships understand this instinctively. The question is whether adults—and the corporations selling intimacy—possess equal wisdom.
Sources:
New report shines a light on how teenagers are using AI companions – Bangor University
Everything You Need to Know About AI Companions in 2026 – Psychology Today
Technology and youth friendships – APA Monitor
Trends in digital AI relationships and emotional connection – APA Monitor
