It's time to understand how artificial intelligence (AI) is changing teenage relationships. Photo / 123rf
THE FACTS
Artificial intelligence (AI) has become a source of emotional support and companionship for 23% of New Zealand school children, according to cyber security firm Norton.
The Norton Connected Kids survey found the average baby boomer got their first mobile phone at age 41, while Gen Z kids (born from the late 1990s through to the early 2010s) got theirs at age 14.
Only 41% of Kiwi parents said they had discussed AI dangers such as deepfakes and misinformation with their children.
Like many others around the world last week, I was shocked to read that a teenager in the United States died after turning to ChatGPT for support. It’s a heartbreaking reminder that while artificial intelligence tools can feel immediate and accessible, they are not equipped to respond with the empathy or safeguarding that vulnerable children and young people need.
With almost 500 million people using ChatGPT alone every day, it is clear that generative AI (GenAI) is quickly becoming integrated into our daily lives for a diverse range of uses. And young people are often among the earliest adopters.
Closer to home, research from cyber security firm Norton shows that almost one in four school children in Aotearoa are already using artificial intelligence (AI) chatbots for emotional support and companionship. For some, this trend may feel confronting. For us at What’s Up, Barnardos’ counselling helpline for children and young people aged 5 to 19, it reinforces something we have long understood: children and young people are looking for safe, confidential ways to express their feelings online – but how do they know what is safe?
To begin ensuring that services meet the needs of young people, we must first understand why GenAI offers a solution where they feel other options do not. We already know that, even before AI, chat functions were a popular way for young people to communicate with one another and express their feelings, allowing them to feel in control of what they had to say.
More than 10 years ago, What’s Up introduced a webchat service alongside our phone line to address exactly this demand from those we help. We recognised early that many children feel more comfortable typing than talking – and that choice matters. Chatting allows them to stay private, share at their own pace and express things that might feel too hard to say out loud. Over time, we’ve seen a steady rise in chats; today, more young people reach out to us this way than by phone.
But there’s a vital difference between a helpline like ours and a chatbot. Every message is answered by a trained counsellor: a real person who listens with empathy, can respond if there’s any concern for safety and can walk alongside a child over time. Many children choose to come back to the same counsellor again and again, building trust and continuity. We don’t find answers to their problems; we work with them so they find solutions. That kind of human connection can’t be replicated by AI.
We all have a role to play. As parents, caregivers, teachers and coaches, we can talk with children about the difference between interacting with a GenAI bot and with another human, and make it clear what is gained by talking with others, even if it’s through an online chat. Services like What’s Up are anonymous, confidential and always answered by someone who cares, meaning the opportunity to receive additional human support is always there.
As we start to think about a future where AI is ever-present, we must begin to discuss what options we are also providing for human interaction. We need to create the opportunity for tamariki and rangatahi (children and young people) not only to have rich experiences with technology such as AI, but also to hear about the genuine experiences of others and learn about humanity from humans.
It matters, because we know that the mental health challenges children and young people face are not going away, as much as we would like them to. While GenAI gives fast and coherent answers, in its current state, it cannot provide the human elements of professional care, kindness, compassion and understanding that young people need to deal with their emotions, process trauma and build resilience.
Importantly, GenAI is not able to provide immediate help should a child be unsafe. Our counsellors answer more than a hundred calls or chats each year from children in immediate danger. They have developed safety plans to de-escalate immediate risks and/or refer tamariki to specialised services, including mental health providers, Oranga Tamariki and the police as appropriate.
So it may now be time to question whether tech companies need to step up and play their part. Just as media stories that carry trigger warnings or mention suicide are expected to provide helpline information (as you’ll see at the bottom of this article), AI tools could direct children and young people to trusted services.
Pointing them to real human help is a simple safeguard that could save lives.
Ultimately, what matters most is that children and young people can get the right help when they need it. Right now, long waitlists mean many can’t access face-to-face support quickly. That’s where helplines play a vital role: providing immediate, professional care from the moment a child reaches out.
We also need to make sure there are enough trusted options available, allowing young people to access human connection when they need it most.
Children are already talking. The question is: who’s truly listening? We must make sure it’s someone who can hear them and help them when it matters most.
Dharshana Ponnampalam is the What’s Up Manager at Barnardos Aotearoa. What’s Up is a counselling helpline for children as young as 5 – and counsellors answer over 12,000 calls and chats from children and young people each year.