Who is most likely to turn to ChatGPT for health information?
New Australian research has uncovered how many patients are using the AI tool for clinical advice and just what questions they are asking.
Women and those aged 35–44 were among those most likely to ask the AI platform a high-risk health-related question.
‘Don’t mistake your Google search for my medical degree.’
It is not uncommon to see signage stating exactly this in doctors’ offices around the country, warning patients, albeit with a humorous undertone, to avoid Dr Google.
But the rise of artificial intelligence (AI) tools such as ChatGPT indicates that the demand for instant access to health information and advice isn’t going away any time soon. At the same time, questions around accuracy are raising concerns about the potential spread of misinformation and the risk this poses to people’s health.
Now new research, undertaken by the University of Sydney, is helping to shed light on user trends, including insights into which demographics are more likely to use the platform.
In June 2024, researchers surveyed a nationally representative sample of 2034 people aged 18 and over and found 1523 (84.7%) were aware of ChatGPT and 187 (9.9%) had used the platform to obtain health-related information during the preceding six months.
Usage was most common among people aged 18–44, those living in capital cities, those born in non-English speaking countries, those who spoke languages other than English at home, and those with limited or marginal health literacy.
Questions asked most frequently related to learning about a specific health condition (48%), finding out what symptoms mean (37%), finding actions to take (36%), and understanding medical terms (35%).
Concerningly, the study revealed that 61% had asked ChatGPT at least one higher-risk question, relating to actions that would typically require clinical advice.
This was more common among people born in non-English speaking countries (95%) and those who spoke a language other than English at home (95%).
Melbourne GP Dr Preeya Alexander has made a name for herself debunking medical misinformation both inside and outside the consulting room.
She says most GPs are aware patients are turning to other mediums to seek health information and suspects the prevalence will only increase if out-of-pocket costs to see a GP continue to climb.
‘Data in the annual Patient Experience Survey found that the percentage of people not visiting the GP due to cost is going up,’ Dr Alexander told newsGP.
‘The Medicare rebate not reflecting what it takes to deliver general practice is a part of this issue.’
The research found that among those who had asked ChatGPT health-related questions, trust in the tool was moderate.
Launched in 2022, ChatGPT has given the public easy access to generative AI, with the platform attracting 300 million weekly users worldwide as of December 2024, and that number is expected to rise.
The new research supports this expected growth when it comes to people seeking out health information and advice.
Among the respondents who were aware of ChatGPT but had not yet used it for health-related questions during the preceding six months, 591 people (38.8%) reported they would consider doing so in the next six months.
The most common reasons cited were for learning about a specific health condition (18.1%), understanding medical terms (16.8%), or finding out what symptoms mean (16.3%).
Further to that, almost 25% said they would consider asking at least one higher-risk question. This was most common among female participants, those aged 35–44 or 55 and older, and those with year 12 education or less, or an advanced diploma or diploma.
While the study estimates that 9.9% of Australian adults – about 1.9 million people – asked ChatGPT health-related questions during the first half of 2024, the authors note that this is likely to be a conservative estimate given the rapid growth in AI technology and the availability of other similar tools.
International research published in 2023 found ChatGPT provides medical information of ‘comparable quality to available static internet information’, but noted it remains of ‘limited quality’.
This raises particular concerns, given the latest Australian research found that health-related ChatGPT use was higher among groups who face barriers to healthcare access, including those with limited or marginal health literacy.
But Dr Alexander says clinicians have a key role to play in helping to redirect patients to reliable sources of information.
‘We should expect that patients will want to know more and read more around a specific health condition and thus give them reputable options as they walk out of our consulting room,’ she said.
‘We should also ask our patients where they are seeking information related to health so that we can offer more reliable alternatives and combat any misinformation they may have been exposed to.’
However, Dr Alexander acknowledges that there are bigger questions at play, with complicated answers that are related to a multitude of factors, from financial barriers to cultural ones.
‘Obviously the broader issues here are: how do we get the patients turning to these platforms into a consulting room with a qualified health professional instead, and how do we ensure information related to health on platforms involving AI are more reliable?’ she posed.
‘On social media alone, I see the amount of misinformation people are subjected to – it’s muddy and confusing for people – and ideally we should have some idea of what our patients are being exposed to so we can do a more effective job in the consulting room.’
The study authors note that the types of health questions that pose a higher risk for the community ‘will change as AI evolves’ and that identifying them will require further investigation.
‘There is an urgent need to equip our community with the knowledge and skills to use generative AI tools safely, in order to ensure equity of access and benefit,’ they said.