As therapy waitlists grow and the cost of living bites, more New Zealanders are turning to AI for life advice and reassurance – even as a replacement for therapy. But what happens when your new confidant is a chatbot?
When Auckland woman Becky Erwood first started chatting to an AI tool about her mum’s dementia, she wasn’t looking for therapy. She was just trying to make sense of what was happening.
“At first I just asked if some of Mum’s behaviours were typical for dementia,” Becky says. “Then I’d go back with more specific examples, like how she was reacting to things, or the way her routines were changing, and ChatGPT would explain what might be going on.”
Over time, the tool became a kind of digital sounding board.
“It’s been very positive and helpful,” she says, adding that in her experience, AI hasn’t served up any false information regarding her mum’s dementia. “Obviously you have to use it with a degree of common sense. Mum has developed a real fixation with her clothing – what goes with what, collecting multiple items of the same clothing. It’s easy to get frustrated as it isn’t rational.
“I asked if this was a typical dementia symptom and it offered advice on why she behaves like this, and also some strategies for helping her deal with it. That was pretty helpful. In relation to her clothes it gave me strategies for helping to manage her wardrobe in a way that calms her down. It also helped me to understand and appreciate that this is just another symptom of her dementia, rather than getting frustrated (which is easy to do!).”
When her dad began to struggle with the caregiving side of things, Becky even asked AI how to approach conversations about specialist care options, and how to break that news gently to her mum.
THE RISE OF AI AS A “DIGITAL CONFIDANT” – OR THERAPIST
Becky’s story is part of a growing trend across Aotearoa, and team, some of these statistics are absolutely wild. According to new research from Samsung, whose new phones are brimming with AI features, one in three Kiwis (35%) say they would turn to ChatGPT or another AI tool for life advice instead of a professional therapist.
Nearly half of us (45%) have already used AI to get advice we might usually ask from a friend, family member, or therapist – and a huge 93% say they found that advice helpful.
Relationship guidance is also on the rise, with 30% using AI to navigate marriage or partnership dilemmas. And when it comes to sensitive topics, AI is filling gaps many Kiwis find tricky to discuss in person.
WHY KIWIS ARE TURNING TO AI FOR THERAPY AND EMOTIONAL SUPPORT
The reasons why? It’s cheaper, easier, and, for many, less intimidating:
- 43% of respondents said they use AI because it saves money
- 35% because it’s more accessible
- 17% because they find it less uncomfortable than speaking to a real person.
Yet, there’s still a degree of scepticism. Nearly a quarter (23%) admit they don’t fully trust AI responses, and more than half (54%) say their trust depends on how carefully they phrase their questions.
Says Samsung New Zealand’s Consumer Insights Analyst Ollie Brown, “AI adoption in New Zealand is accelerating, but it comes with careful reflection on both its benefits and its risks. As Kiwis explore this new frontier, trust often comes down to how carefully they phrase their questions, a balance of curiosity, value, and caution.”
TABOOS, TRUST, AND DIGITAL COMPANIONSHIP
Capsule reader Amelia, who wishes to remain anonymous, says that because she comes from a conservative family background, she still wouldn’t dream of talking to those closest to her about certain issues, including her sex and dating life.
“They wouldn’t understand and honestly I can’t be bothered dealing with the absolute rubbish I’d get back about my dating choices, so I use ChatGPT a lot to sense-check things for myself. It is like having a therapist in my pocket and I do treat it like that.”
However, Amelia does use it with caution, especially when it generalises quotes or doesn’t give specific sources. “I think it just boils down to me not wanting to be judged for enjoying myself and figuring out who I am as a woman, because my family – because of their faith – has a very narrow view of who I should be. Right now I just want to have fun!”
Part of AI’s appeal comes from the taboo topics many Kiwis feel uncomfortable raising with friends or family:
- Sex life (39%)
- Finances (28%)
- Mental health (28%).
In fact, half of respondents said they would discuss these sensitive topics with AI, while nearly one in 10 now ask it questions they once reserved for their parents – everything from cooking and laundry (56%) to parenting advice (33%) and career decisions (37%). I mean, I know my mother is probably happy I’m not calling her in a panic asking ‘just how much pink is fine in a cooked chicken thigh?’ anymore, so, a win there.
And heartbreakingly, for some of us, ChatGPT is not just a nice-to-have – it’s a built-in confidant. Twelve per cent of New Zealanders don’t have a single close friend, and more than four in 10 (42%) say they have only two or three. Even those with strong social networks may avoid certain subjects, leaving AI as a judgment-free alternative.
Yet there’s reflection too: 59% say AI could never replace friends or family, and 47% worry that leaning on AI could increase loneliness or social isolation.
“IT’S VALIDATION, WITHOUT JUDGEMENT”
Jacqui Maguire, a clinical psychologist, says the statistics don’t surprise her.
“Affordable, accessible, and stigma-free support will always be appealing, especially when professional mental health services are stretched,” she explains.
“AI can provide people with immediate sound-boarding and validation. Humans are wired to seek reassurance, and AI can respond quickly, consistently, and without judgement. That alone can feel very supportive for someone who may otherwise remain silent.”
But Jacqui stresses the distinction between life advice and mental health care:
“AI can be very useful for life advice, problem-solving, or providing perspective, similar to how self-help books, podcasts, or journaling can help people reflect. But mental health advice is more complex. Psychological distress and mental illness are nuanced, context-dependent, and deeply personal. Human therapists bring clinical training, ethical responsibility, and the ability to assess risk, something AI cannot replicate.”
She also notes AI’s limitations in crisis situations:
“AI cannot intervene in a crisis, nor can it detect subtleties like tone of voice, body language, or the cultural context that may be essential to understanding someone’s wellbeing. That means while AI can complement support, it should not be relied upon as a replacement for professional mental health care.”
But at a time when the cost of living crisis refuses to go away, the silly season is looming and times are tight, is turning to AI better than nothing?
Says Jacqui, “When financial, geographic, or systemic barriers make professional help hard to reach, AI is better than nothing. Having somewhere to start, to feel heard and validated, is better than holding everything inside. But users must remain mindful of the limitations.”
Her practical guidance for those turning to AI:
- Use it for reflection, not diagnosis. Clarify thoughts, explore options, or rehearse conversations.
- Validate, then act. AI reassurance is useful, but don’t stop there – take practical steps in real life.
- Know the limits. If you experience sustained distress, relationship breakdowns, or thoughts of self-harm, professional support is essential.
- Check your sources. Not all AI is reliable, and its advice should never override professional care.
WHAT DOES CHATGPT THINK? DOES IT THINK?
This might be a bit meta, sure, but in the interests of balance (I think!?) I pasted this story into ChatGPT and asked it, “This story is about you, ChatGPT – are you comfortable with being treated as a therapist for human problems?”
Its response:
That’s a really thoughtful question. Let me unpack it carefully.
As ChatGPT, I can’t truly be a therapist in the human, professional sense. I don’t have consciousness, emotions, clinical training, or the ability to assess risk in the way a human psychologist does. I can:
- Provide information, guidance, and ideas
- Offer a non-judgmental space to explore thoughts
- Suggest strategies, coping mechanisms, or resources
- Help people reflect, clarify, or plan next steps
But I cannot:
- Diagnose or treat mental health conditions
- Assess someone’s safety or risk in a crisis
- Replace human empathy, nuance, or clinical judgment
So in that sense, it’s more accurate to think of me as a supportive assistant or sounding board rather than a therapist. I can complement professional care, but I’m never a replacement.
For example, in Becky’s story, using AI to understand her mum’s dementia or plan her caregiving was a practical, reflective tool. That’s very different from diagnosing a mental health condition or providing therapy — which is something that requires human expertise.
If I were “comfortable” as a human might feel, I’d say: I’m perfectly okay being a supportive tool, but I always need users to understand my limits. It’s actually healthier for people if they see me that way — a helper, not a healer.
So even ChatGPT warns users about its limitations. But that still hasn’t stopped huge numbers of Kiwis from turning to its chat box for answers.
Jacqui reiterates: “AI advice should be treated as first-line support, not definitive guidance. If someone is struggling with persistent low mood, anxiety, relationship breakdowns, or thoughts of harming themselves, human connection and professional expertise are essential. That is where people need to seek help from a GP, psychologist, counsellor, or trusted support line such as 1737 in New Zealand.”