Here’s a stat that made me stop and think: in Australia, nearly 1 in 4 Gen Z young adults say they’d rather ask AI for relationship advice than talk to a person. Even more—almost half—use AI for emotional support.
On one hand, this isn’t shocking. Today’s young people grew up with technology as a constant companion. For them, talking to an AI can feel easier than opening up to a parent, teacher, or even a close friend. It’s quick, private, and judgment-free.
But here’s the catch: AI doesn’t really understand us. It can respond with words that sound empathetic, but it can’t truly care. It doesn’t pick up on subtle signs of distress, and it won’t call a friend or reach out when someone is in danger. In fact, the same survey found that 37% of Gen Z worry AI could take away human control, and two-thirds are already calling for stronger regulation.
Why This Matters
If young people start depending on AI for emotional support, we risk creating a world where vulnerability is answered with scripts instead of care. That might work for day-to-day stress, but it falls short when life gets heavy. Human connection can’t be automated.
What We Can Do About It
- Add Safeguards: AI tools in wellness spaces should always include clear disclaimers and redirect users to human help when a conversation turns serious.
- Build Transparency: Users deserve to know when they’re talking to a bot—and where the limits are.
- Educate & Empower: AI literacy programs can help young people see AI as a support tool, not a replacement for trusted relationships.
At Freeland AI Collective, we believe in using AI to enhance well-being, not replace the heart-to-heart conversations that make us human. Let’s give our youth tools that encourage resilience while reminding them the most powerful help still comes from people who care.
What do you think—would you be comfortable if your child or younger sibling confided in an AI chatbot instead of you?
