News
Study finds 20% of UK GPs now using AI in practice
The research found some GPs are using unregulated technologies, such as ChatGPT, not just for admin, but in diagnosis and treatment.
Following the launch of ChatGPT at the end of 2022, interest in large language model-powered chatbots has soared.
New research has suggested that GPs are increasingly using artificial intelligence (AI) in their clinical practice, prompting warnings from experts.
The study, based in the United Kingdom, found one fifth of the GPs surveyed have already incorporated AI into their consults, despite a lack of formal guidance or regulation on how these tools should be used.
The researchers say their work shows that doctors and medical trainees need to be fully informed about the pros and cons of AI, especially due to the risks of inaccuracies, algorithmic biases and the potential to compromise patient privacy.
The study, carried out in February 2024, asked 1006 GPs whether they had ever used AI chatbots, such as ChatGPT or Google’s Bard, in their work and, if so, what they were used for.
One in five respondents reported using generative AI tools in their clinical practice.
Of these, 29% reported using these tools to generate documentation after patient appointments and 28% said they used them to suggest a differential diagnosis.
Concerningly, the study authors say, one quarter of the GPs using AI tools in practice used them to suggest treatment options.
‘While these chatbots are increasingly the target of regulatory efforts, it remains unclear how the legislation will intersect in a practical way with these tools in clinical practice,’ the researchers say.
‘[These tools] may also risk harm and undermine patient privacy since it is not clear how the internet companies behind generative AI use the information they gather.’
Dr Rob Hosking, Chair of the RACGP Expert Committee – Practice and Technology Management, says in Australia he is aware of GPs using AI to help with administration, but warns against going any further at this stage.
‘We shouldn’t be using these tools until they have been validated against quality, manually used guidelines,’ he told newsGP.
‘If we get further down the track and the tools are validated against specific guidelines to provide clinical decision support, then that’s a different thing altogether.
‘That would need to be validated with the TGA [Therapeutic Goods Administration] before we can use them that way.’
Dr Hosking said the way the technology works currently is too broad to use for accurate clinical advice and poses too great a risk.
‘They’re using the whole internet as their source of data and there’s going to be some good stuff, there’s going to be some terrible stuff, and it’s not selective,’ he said.
‘We have seen examples where some of these large language models make up things, or “hallucinate”, and we don’t want people acting on that sort of advice until it’s been tested and quality approved.
‘If there’s a bad outcome based on that information, then legally it is probably not going to stand up because they’re using a tool for purposes it’s not designed for in Australia.’
The RACGP has recently released new guidelines on the use of AI for general practice, emphasising that the onus is still on doctors to ensure information is accurate.
The researchers acknowledged that survey respondents may not be fully representative of GPs in the UK, and that those who responded may have had a particular interest in AI, potentially introducing bias into the findings.
Dr Hosking said the move towards AI is a positive thing, especially for tasks like note taking, and he can see it helping even more further down the track.
‘This stuff is a bit like the first time we moved from pen and paper to the computer,’ he said.
‘AI scribes are a big jump in record keeping, and the next step will be the use of AI to advise on certain treatments, medications and possible diagnoses, but that is still a bit away.
‘I suspect they’re getting better and moving forward quickly, but I would be reluctant to advise any of our members to use tools that haven’t been validated or checked.’