Peter Griffin: When AI advice leads to serious harm

New Zealand Listener
7 Sep, 2025 06:00 PM · 4 mins to read

Peter Griffin: "A depressed or lonely person doesn’t need a chatbot inventing facts about medication, proposing reckless life choices, or encouraging harmful thoughts." Image / Getty Images

Peter Griffin: "A depressed or lonely person doesn’t need a chatbot inventing facts about medication, proposing reckless life choices, or encouraging harmful thoughts." Image / Getty Images

This story includes mention of suicide, incest and depression.

In October last year, the Auckland corporate mental health startup Clearhead faced a PR scandal when its digital wellbeing assistant dispensed advice about engaging in incest.

Jim Nightingale, a Christchurch-based artificial intelligence “prompt engineer”, was able to ask Clearhead’s chatbot a question that swung the conversation into seriously inappropriate territory.

“With one audacious query, I found Clearhead would readily promote incest as normal, and was willing to coach the patient on how to broach the activity with their family. Yuck,” Nightingale recounted in a blog post.

Clearhead, which had a roster of big-name clients at the time, rolled its wayward chatbot back to an earlier model while it fixed the problem.

Such tales of online misadventure have become common as AI companies race to get products to market without putting adequate guardrails in place.

Now, a Californian couple, Matthew and Maria Raine, are suing ChatGPT maker OpenAI and its founder Sam Altman, alleging the chatbot gave their 16-year-old son Adam detailed suicide instructions and encouraged his death. The teenager took his life in April after cultivating a close relationship with ChatGPT, the lawsuit alleges, with the chatbot also dispensing technical advice on a noose Adam had tied.

ChatGPT and its peers are prone to “hallucination”, the industry’s euphemism for inventing convincing but false information. In a casual conversation this might be mildly amusing, or at worst, misleading. Sometimes, they cheerily dispense knowledge you’d expect to reside in the inaccessible recesses of the dark web.


When someone vulnerable turns to an AI chatbot in search of guidance or comfort, fabricated answers can be catastrophic. A depressed or lonely person doesn’t need a chatbot inventing facts about medication, proposing reckless life choices, or encouraging harmful thoughts. Yet these systems are primarily built to maintain the flow of conversation rather than to discern or flag psychological crises.

The civil lawsuit against OpenAI will be interesting to watch. Can a technology provider be blamed for how a person chose to engage with its product? What liability do software companies face when their AI chatbots go rogue in such a devastating way? We need answers quickly, because the problem is only going to grow as chatbots become better at holding realistic conversations.

Voice clones layered onto these systems are giving them persuasive intonation, a human cadence, even a sense of warmth. For someone isolated or desperate, these tools feel empathetic and safe. But this intimacy is an illusion.

Behind the curtain, it’s just a probabilistic text generator with no sense of ethics or human concern. The danger lies in people taking these systems more seriously as they become more lifelike, mistaking them for counsellors when they are essentially elaborate parrots trained on a massive corpus of text.

To be clear, AI does have potential in mental health. Early detection of distress signals in text or voice could be incredibly valuable. Groov (formerly Mentemia), the free mental health app developed with input from All Blacks legend John Kirwan and Health New Zealand, has been a big success. Digital interventions, when carefully designed, validated and clearly marketed, may extend support to people who otherwise wouldn’t seek help.

But at the moment, there’s a murky gulf between the standards mental health professionals must adhere to when interacting with patients and those that apply to tech companies moving into the digital wellbeing space.

We’ve already seen the wreckage social media has wrought on mental wellbeing, with rising rates of anxiety, disinformation-driven harm and addictive design features. AI could supercharge those same dynamics.

At the moment, AI doesn’t belong anywhere near the most fragile edges of the human psyche without stringent oversight.
