News
Increase in patients turning to AI for therapy
As patients face barriers to mental health care, many are turning to ChatGPT for help, but experts say it brings a risk of devastating mistakes.
With more Australians than ever before reporting barriers to accessible mental health care, GPs have pointed to a growing and worrying trend of patients experimenting with chatbots as a form of talk therapy.
Health experts say they have seen a rise in patients using artificial intelligence (AI) platforms, such as ChatGPT, to seek psychological support.
In one example, a TikTok user is seen writing her thoughts into a Word document before entering instructions such as ‘read the following journal entry and provide an analysis of it’ into ChatGPT.
This comes at a time when mental health services continue to be difficult to access, with long wait times and closed books leaving many patients feeling unsupported.
Dr James Collett, a psychologist and Senior Lecturer at RMIT University, has himself noticed the trend and says AI therapies are ‘here to stay’.
But he said the use of general platforms, such as ChatGPT, could lead to patients missing out on important elements of psychotherapy, such as personal trust and rapport, and so ‘might not be getting the best support’.
‘What we’re seeing is unsupervised seeking of mental health support online,’ Dr Collett told newsGP.
‘There might be cases where people are talking about topics that they realistically need support with, and we would be worried about their welfare, but that’s not coming to light because they’re using ChatGPT.
‘There’s probably some superficially useful therapeutic advice that it can draw on, but it’s not necessarily matching that to clients’ individualised needs.’
Dr Collett said AI could be used to complement psychological therapy, such as when people are deciding whether to seek psychological support, and to provide scaffolding between sessions or once a course of treatment has been completed.
However, he said these new therapies must be developed by teams with training and experience in psychotherapy.
‘I don’t think that there’s any putting AI back in the bottle, it’s out in the world, so I think it would be naive to propose a message like “we should never use AI for anything to do with therapy”,’ Dr Collett said.
‘I’m sure there are people out there developing therapeutic-oriented AI with an evidence base behind them, I would envisage that is probably the ideal future of AI use in psychotherapy.’
This rise in AI therapy comes at a time when costs, lack of availability, and patients not knowing where to seek help are the top three barriers to people getting the care they want, according to a recent Australian Psychological Society survey.
Dr Cathy Andronis, Chair of RACGP Specific Interests Psychological Medicine, told newsGP there are many considerations for clinicians and patients as they consider how to best use AI.
‘While there are benefits for GPs using AI, mostly some time saving with note taking of the conversation, there are many more risks for GPs in a consultation with patients discussing sensitive mental health-related content,’ she said.
Mental health continues to be one of the top reasons patients are seeing a GP, with the RACGP’s 2024 Health of the Nation report finding psychological issues remain in the top three presentations for 71% of GPs.
In August, a four-year review from more than 50 leading psychiatrists, psychologists, and those with lived experience across five continents described the rise in youth mental health problems as a ‘global crisis’.
It found that in less than 20 years, there has been a 50% increase in rates of mental ill-health among Australian youth, with the peak age of onset 15 years old and 63–75% of onsets occurring before the age of 25.
In response, Dr Andronis highlighted that experienced clinicians use metacognition to understand and support their patients, and work towards helping them develop skills to manage on their own – something which AI cannot yet do.
‘An astute clinician recognises key themes and content which are affecting the patient and contributing to their problems; an AI transcriber cannot do this, as it requires reflective skills,’ she said.
‘This capacity for metacognition is the trademark of an experienced therapist.
‘As psychotherapists become more experienced, they focus less on content and more on the process components of therapy … this is our expertise.’
She said it remains to be seen whether AI can ever develop these reflective skills.
‘There is research currently into AI-assisted therapy using chatbots – these are being trained by psychotherapists,’ Dr Andronis said.
‘They can only learn what we teach them. And if they are not well trained, nor taught by experienced therapists, they will make mistakes, including potentially fatal ones for the patient.’