News
Are AI scribes risking patient safety?
The college has released new guidance for using artificial intelligence in practice in a bid to help GPs safely navigate emerging technologies.
A new set of artificial intelligence (AI) scribe guidelines has been developed to help GPs balance convenience with the need for accuracy and patient safety.
Released on Tuesday, the new guidance comes as emerging technologies continue to change general practice and consulting, while also bringing added complexity and consequences.
It states that GPs must be aware of three key issues when adopting AI: clinical, privacy and security, and workflow and practice.
AI scribe tools were designed to automate parts of the clinical documentation process, converting a conversation with a patient into a clinical note, summary, or letter.
However, the new guidance emphasises an AI scribe cannot completely replace the work of a GP, and that the onus remains on the doctor to ensure information is accurate.
‘GPs are ultimately responsible for ensuring that the patient health record is accurate and up to date,’ it says.
Dr Rob Hosking, Chair of the RACGP Expert Committee – Practice and Technology Management, told newsGP AI scribes are already changing the way many GPs conduct their consultations.
‘Some GPs currently type notes as they go during the consultation, others type the notes at the end whereas with the AI, you don’t need to do as much of that,’ he said.
‘For some GPs it can also change the way they consult in terms of how we communicate with our patients, because we tend to use a lot of lay terms rather than jargon, but we might record our notes with medical terminology.
‘If a doctor decides to change their style of practice and speak out loud the examination findings … that may inadvertently create more of a mismatch of knowledge when we spend a lot of our time with patients trying to bring that barrier down.’
There is a long list of benefits to using AI scribe software, including reducing the administrative burden and allowing the GP to focus more on their patient during a consult.
However, according to the RACGP’s new guidance, there is also a range of potential harms and problems.
Among these, it says, is the fact it is still an emerging field and ‘unforeseen legal problems might also arise as their use increases’.
RACGP President Dr Nicole Higgins said AI scribes can help to reduce the administrative burden for GPs and improve patient satisfaction, but must be used with caution.
‘The administrative burden on GPs needs to be reduced urgently – these tools will also allow GPs to focus on the patient instead of their computer during a consult, meaning happier patients,’ she said.
‘Everyone deserves the quality care that comes from having a GP who knows you, and your health history – AI can never replace this relationship, but it can help with administrative tasks, and this will help GPs focus more on our patients, which is what we want.’
Clinically, the guidelines caution that there is limited data currently available on the utility, validity, and patient safety of AI scribes.
They also warn that AI scribes may miss relevant information that is not explicitly discussed during the consultation, such as hospital discharge summaries, imaging reports, non-verbal cues from the patient, or data from medical devices.
It also warns that AI scribes can make errors which affect the meaning and accuracy of clinical information, such as mishearing symptoms, medicines, or conditions, or incorrectly categorising data.
‘As these tools gain popularity and their use increases, there is potential for GPs to become over-reliant on their use and pay less attention to critical clinical details or forgo the vital process of checking the output generated by the AI scribe, resulting in errors that could affect patient safety,’ it says.
Dr Hosking said it is still early days when it comes to AI and emerging technologies in general practice, and in time, he expects systems will become smarter.
‘They’re only going to get better with ongoing machine learning and work by the various vendors to integrate them with our system, so I suspect they will get better,’ he said.
‘But there’s always going to be that issue of what level of notes do you want to keep in terms of the terminology and technology versus what you’re trying to communicate with patients.
‘You’ve got to be careful, just like in our everyday lives, because are we running the risk of dumbing ourselves down and losing the skills?’
The RACGP has also outlined a range of privacy and security issues associated with AI scribes, and states that GPs must obtain consent from a patient to use the tool in the consultation.
Some medical defence organisations also require GPs to obtain the written consent of the patient before recording the conversation.
‘A data breach might occur if audio recordings, text transcripts, or clinical documentation prepared by the AI scribe are intercepted or otherwise compromised,’ the guidelines state.
‘GPs considering purchasing an AI scribe should carefully review the terms and conditions of the user agreement to determine whether collected data will be used for secondary purposes by the vendor or a third party.’
The guidelines go on to recommend that GPs or practices considering AI scribe software ask themselves the following questions:
- Are you satisfied you have been provided with enough evidence of the product’s safety, efficacy, and applicability?
- Is the AI scribe easy to use?
- How will I adjust clinical workflows to account for the time needed to check the clinical documentation generated?
- Have I informed patients about the use of the AI scribe at the practice?