Data from Health NZ show 176,390 people in New Zealand accessed mental health and addiction services in 2023/24, with the 15-19 age group showing the highest rate of access.
Senior clinical child and adolescent psychologist Sarah Watson said a therapy app was unlikely to reduce pressure on frontline staff because those services were under strain and restricted to seeing people with moderate to severe mental health needs.
The app was designed for mild to moderate cases and was not suitable for anyone experiencing serious psychological distress, she said.
“I think that they are very nice-to-haves, the apps, but in my view, the need is actually the frontline staff,” Watson said.
“It shouldn’t replace actual care.”
Kirwan said he agreed that AI technology should never replace the in-person side of counselling, but young people were already turning to chatbots for therapy, and it needed to be made safer for them.
“We don’t want to take away what person-to-person help should be, right? What we’re trying to do is get the information for the people if they have to wait six weeks for a psychiatrist,” the 1987 Rugby World Cup winner said.
“Ask Groov is safe; the wild world of AI out there is not.”
Watson said the human element of mental healthcare was extremely important because support and treatment often required a sophisticated understanding of the wider context of an adult or young person’s life – something a chatbot couldn’t access.
In the United States, the death of a 16-year-old boy by suicide after long conversations with OpenAI’s ChatGPT sparked widespread concern about AI regulation.
The Herald tested the chatbot Ask Groov with short conversations. Unlike other popular chatbots, it included regular prompts to speak to a counsellor and information about accessing helplines.
However, its advice also appeared slower and less human-sounding than that of chatbots such as ChatGPT.
Groov CEO Matt Krogstad said the Ask Groov chatbot was ringfenced with mental health information written by certified psychologists and psychiatrists, rather than pulling data from the internet or Reddit like other chatbots.
He said the speed of the chatbot was going to continue getting better.
“Delivering AI to New Zealanders in a safe way is a critical problem that is obviously a public health issue, but it’s also a major economic issue because mental health and wellbeing is a huge cost to the public sector,” Krogstad said.
Child psychiatrist and paediatrician Hiran Thabrew, who works at Starship children’s hospital, said the widespread adoption of AI technology could not be ignored and it had the potential to be “a valuable supplement” to traditional services.
“I was sitting on a plane next to teenagers and they’re using it [ChatGPT] to answer every question that might come into their mind, it’s become a sort of accessory to their brain really, or their memory,” Thabrew said.
“It’s a good thing in that way because it might increase the range of mental health support that’s available to people. And also, you know, it’s a less stigmatising way than having to go see somebody or go to a mental health service.”
However, he had questions about how people's sensitive mental health information would be securely stored, and feared potential disparities for Māori patients because of the English-language bias of the models underpinning chatbots.
“It shouldn’t be completely free rein in terms of AI development and the kind of advice that’s given,” Thabrew said.
Krogstad said the data collected from the Ask Groov chatbot were anonymised so they could not be collated into an identifiable human profile, which made it safer for people than other chatbots.
Psychotherapist and AI ethics researcher Dr Brigitte Viljoen said generative AI carried the risk of hallucinations – AI making up facts. She was also concerned about the lack of longitudinal clinical testing of the Government's therapy chatbot before it was put in front of potentially vulnerable people.
“I’m just concerned that they’re rushing into this, thinking AI is going to fix everything and not doing their due diligence,” Viljoen said.
AI technology was already being used by clinicians to reduce administrative time in writing letters, note-taking and paperwork, Thabrew said.
Viljoen said there needed to be government-wide AI guidance for the future implementation of technology in mental health services.
At the end of the day, Thabrew said, the downside of AI was that although it sounded incredibly compassionate, it was not genuine human connection.
“It’s still all fake, right? It’s not genuine. It’s not somebody in front of you feeling how you’re feeling and responding to you in a genuine, empathetic way,” he said.
Eva de Jong is a New Zealand Herald reporter covering general news for the daily newspaper, Weekend Herald and Herald on Sunday. She was previously a multimedia journalist for the Whanganui Chronicle, covering health stories and general news.