
Professional
Volume 54, Issue 5, May 2025

Is AI A-OK? Medicolegal considerations for general practitioners using AI scribes

Owen Bradfield, Patrick Mahar
doi: 10.31128/AJGP-10-24-7438

Background

Good medical records are an essential part of healthcare. However, the burden of clinical documentation can reduce clinician productivity and add to stress and anxiety. Artificial intelligence (AI) scribes offer a solution by using large language models to predict text and rapidly generate complex records. Important medicolegal issues should be considered before adopting AI scribes into clinical practice.

Objective

This article examines this rapidly emerging field and the medicolegal issues faced by general practitioners, and provides references to relevant legislation, guidelines and cases.

Discussion

General practitioners (GPs) must understand how AI scribes work and test them before introducing them into their practice. Patient consent must be obtained before each use; data must be stored and retained in accordance with relevant privacy laws; and records must accurately reflect the content of the consultation and comply with regulations governing their content and creation.


This article is part of a longitudinal series on artificial intelligence (AI).

Medical records are essential to the safe delivery of high-quality healthcare. They facilitate communication and continuity of care between practitioners, health services and patients, thereby reducing clinical errors.1 In addition to serving important research, quality improvement and disease-outbreak monitoring functions, they also provide valuable evidence in the assessment, prosecution and defence of medicolegal claims against general practitioners (GPs). There is a well-known axiom that ‘good records are a good defence, poor records are a poor defence, and no records are no defence’. Where there is a dispute between a GP and patient about what transpired during a consultation, courts will often prefer the evidence of the GP where it is supported by, and consistent with, what is documented in the medical record.2,3

Despite the benefits of good medical records, the excessive time required to create and review them can adversely impact clinician and health service productivity.4 Clinicians often say that they spend more time documenting clinical encounters than they spend with patients.5 This ‘pain point’ in daily medical practice can amplify stress and burnout.6 In Australia, over the last 18 months, we have witnessed the advent of a potential solution with the rapid development and release of artificial intelligence transcription tools (hereafter described as ‘AI scribes’). AI scribes work by audio recording a consultation, converting speech to text, and then applying predictive models to generate a summarised output, such as a progress note, referral letter or report.7 Many AI scribes can be integrated into existing electronic medical records systems, making them easy to implement into standard clinical practice.

AI scribes have been labelled a ‘game-changer’,8 reducing stress levels9 and the time taken to document consultations by up to 40%.10 A survey of 1000 Australian GPs showed that AI scribe use increased from less than 3% in May 2024 to 8.24% by October 2024,11 and AI scribes are being used in over 10,000 consultations per day.12 In the UK, at least one in five GPs uses an AI scribe on a daily basis.13 The combination of novel technology, evolving laws and regulations and increasing uptake makes it critical that GPs understand the medicolegal implications before implementing AI scribes in their practice. Even after implementation, AI scribes need to be continuously monitored to ensure that they operate as intended and that they remain compliant with Australian laws and regulations. This article presents some of the critical medicolegal considerations for GPs when using AI scribes, with reference to legislation, recent cases and guidelines.

Due diligence

In Australia, AI scribes do not meet the definition of ‘medical devices’ because they do not support diagnostic decision making.14 This currently makes them unregulated goods that are exempt from registration with the Therapeutic Goods Administration (TGA).15 Therefore, before incorporating AI scribes into clinical practice, GPs should conduct due diligence to ensure that the software has been tested and is suitable for its intended use, meets GPs’ clinical needs and is not also designed to support clinical decision making (in which case, the software may need to be TGA registered).

To better understand the product and its functionality, GPs can request a trial period, demonstration and/or training in the use of the AI scribe and should carefully review (and obtain legal advice on) any user or service agreements before purchasing a licence. In particular, GPs should be alert to indemnity clauses that seek to transfer liability away from the software vendor in the event of product failure or error. GPs should also consider the size and reputation of the software vendor and its capacity to protect the security of information. If a product licence is purchased, GPs should continue to monitor the performance of the AI scribe to ensure that it remains appropriate and compliant with the privacy obligations in the section on ‘Privacy laws’. The TGA is reviewing current frameworks regulating the use of AI in healthcare,16 and it is possible that regulatory settings will evolve as AI scribes become mainstream.

Privacy laws

A complex web of overlapping Commonwealth, state and territory privacy laws regulate the collection, use, disclosure, storage, retention and cross-border transfer of health and/or personal information in the Australian public and private sectors (see Table 1). These are relevant to the use of AI scribes.

First, when AI scribes listen to and transcribe clinical encounters, health and/or personal information is collected. Personal information is information or an opinion about an individual who is identifiable or reasonably identifiable, while health information is personal information relating to an individual’s health or information collected during the provision of healthcare.17–21 We hereafter refer to health and/or personal information as ‘personally identifiable information’ (PII). PII can be in written or audio form.22,23

Second, some AI scribes anonymise the content of audio and transcription files by redacting identifying information such as name and date of birth. If this results in the information being de-identified, then it would not be considered PII and privacy laws may not apply to its collection, use, disclosure, storage, retention or cross-border transfer. However, GPs need to be aware that merely removing the patient’s name or date of birth may be insufficient to de-identify PII if it is still reasonably re-identifiable. It is critical that GPs understand and check whether transcripts are de-identified, as this would impact on whether the further privacy obligations outlined below must be observed. GPs may need to obtain legal advice or speak with their medical defence organisation to fully understand these obligations.

Third, PII may only be used for the primary purpose for which it was collected, unless patients have consented to (or would reasonably expect) an unrelated secondary use. For example, the primary purpose of collecting PII using an AI scribe is to facilitate the provision of healthcare. However, some AI scribes are open source, meaning that they may use PII to train and improve the AI model and make this training data publicly available. Privacy laws would not permit this unrelated secondary use or disclosure unless patients specifically consented, the use was explicitly outlined in a privacy policy brought to the patient’s attention prior to collection, or the information was sufficiently de-identified. Once PII is shared with an open-source AI scribe, it becomes difficult to track, control or remove. Without fully understanding what happens to their PII, patients may find it difficult to make informed decisions about the use of an AI scribe. Therefore, before purchasing an AI scribe licence, GPs should ask software vendors whether any secondary uses of PII are intended.

Fourth, some overseas-based AI scribes may store or process PII outside Australia. The Privacy Act 1988 (Cth) requires GPs using overseas-based AI scribes to either: 1) ensure that the overseas jurisdiction in which the information is stored or processed is subject to privacy laws that are substantially similar to Australian privacy laws (these countries will be prescribed by the Attorney-General through regulation); or 2) obtain consent from patients to the overseas transfer of their PII.24 Thus, while it is not unlawful to use overseas-based AI scribes, it increases the regulatory compliance burden for GPs and practices.

Finally, many available AI scribes do not store recordings of the consultation and delete text transcripts either as soon as the output has been generated or within seven to 30 days. This is consistent with the Privacy Act 1988 (Cth), which requires that PII be destroyed or de-identified when it is no longer required for its original purpose and there is no other legal requirement to retain it.25 However, in Victoria (Vic), New South Wales (NSW) and the Australian Capital Territory (ACT), there is a legal requirement to retain health information for at least seven years from when a patient was last seen or until a child reaches the age of 25 years.26 In those jurisdictions, if audio files or text transcripts contain health information that has not been incorporated into the AI-generated output, deleting that information before these statutory timeframes expire would be unlawful. If a software vendor claims that only non-PII is deleted, GPs in Vic, NSW and the ACT may need to check that transcripts are genuinely non-identifiable. Conversely, if PII is deleted, GPs may need to check that all clinically relevant PII has been incorporated into the final output and, if not, manually amend the output before finalising it. This is a complex issue and largely untested by the courts.

In August 2024, the Office of the Australian Information Commissioner (OAIC) released timely guidance on privacy and the use of commercially available AI products.27 The OAIC expects organisations to adopt a cautious approach to using AI. It recommends that organisations update their privacy policies with clear and transparent information about their use of AI. It also recommends that PII not be entered into publicly available AI tools, as individuals may lose control of their PII and it may increase the risk of data breaches from unauthorised access or cyberattacks. The OAIC also warns users about the risk of algorithmic bias, which can lead to systematic output errors that may disproportionately impact vulnerable groups, such as children or Aboriginal and Torres Strait Islander peoples. This may be especially relevant in a healthcare context.

In September 2024, the Deputy Commissioner of the Victorian Office of the Information Commissioner investigated the Victorian Department of Families, Fairness and Housing (DFFH) after a child protection worker (CPW) used ChatGPT to draft a child protection report that was submitted to the Children’s Court in a case about whether a child required state protection.28 In that case, the CPW entered personal and sensitive information about the child and the child’s family into ChatGPT, including names and risk assessments. The Deputy Commissioner found that the AI-generated report contained significant errors that had not been checked and that this had the potential to seriously mislead the Court. Furthermore, the Deputy Commissioner found that, in using ChatGPT, the CPW had disclosed without consent personal and sensitive information to an overseas company (OpenAI) that could further use and disclose that information for unrelated secondary purposes. This amounted to a serious breach of Victoria’s Information Privacy Principles (IPPs). The Deputy Commissioner issued a compliance notice banning the use of ChatGPT and similar tools by CPWs.

Table 1. Legislation governing health informationA
Cth
  Public sector:
  • Freedom of Information Act 1982 (Cth)
  • My Health Records Act 2012 (Cth)
  • Privacy Act 1988 (Cth)
  • Healthcare Identifiers Act 2010 (Cth)
  Private sector:
  • Privacy Act 1988 (Cth)
  • My Health Records Act 2012 (Cth)
  • Healthcare Identifiers Act 2010 (Cth)
ACT
  Public sector:
  • Freedom of Information Act 2016 (ACT)
  • Health Records (Privacy and Access) Act 1997 (ACT)
  Private sector:
  • Health Records (Privacy and Access) Act 1997 (ACT)
  • Privacy Act 1988 (Cth)
NSW
  Public sector:
  • Government Information (Public Access) Act 2009 (NSW)
  • Health Records and Information Privacy Act 2002 (NSW)
  Private sector:
  • Health Records and Information Privacy Act 2002 (NSW)
  • Privacy Act 1988 (Cth)
NT
  Public sector:
  • Information Act 2002 (NT)
  • Health Service Act 2021 (NT)
  Private sector:
  • Privacy Act 1988 (Cth)
  • Health Service Act 2021 (NT)
Qld
  Public sector:
  • Information Privacy Act 2009 (Qld)
  • Right to Information Act 2009 (Qld)
  • Public Records Act 2002 (Qld)
  Private sector:
  • Privacy Act 1988 (Cth)
SA
  Public sector:
  • Freedom of Information Act 1991 (SA)
  • Health Care Act 2008 (SA)
  • Privacy Principles Instruction (Premier and Cabinet Circular PC012)
  Private sector:
  • Privacy Act 1988 (Cth)
Tas
  Public sector:
  • Personal Information Protection Act 2004 (Tas)
  • Right to Information Act 2009 (Tas)
  Private sector:
  • Privacy Act 1988 (Cth)
Vic
  Public sector:
  • Health Records Act 2001 (Vic)
  • Health Services Act 1988 (Vic)
  • Freedom of Information Act 1982 (Vic)
  • Public Records Act 1973 (Vic)
  Private sector:
  • Health Records Act 2001 (Vic)
  • Privacy Act 1988 (Cth)
WA
  Public sector:
  • Freedom of Information Act 1992 (WA)
  • State Records Act 2000 (WA)
  Private sector:
  • Privacy Act 1988 (Cth)
  • Privacy and Responsible Information Sharing Act 2024 (WA)
A Private sector privacy laws are most relevant to general practitioners (GPs), who predominantly work in privately owned practices.
ACT, Australian Capital Territory; Cth, Commonwealth; NSW, New South Wales; NT, Northern Territory; Qld, Queensland; SA, South Australia; Tas, Tasmania; Vic, Victoria; WA, Western Australia.

Surveillance devices laws

To comply with Australian privacy laws, PII must only be collected by lawful means. Legislation in each Australian state and territory defines ‘listening devices’ and protects individuals from invasions of privacy through the use of listening devices by criminalising their unlawful use. These laws are summarised in Table 2. As AI scribes record private conversations, they are likely to meet the definition of a listening device, and their use without patient consent could breach surveillance devices laws. For example, in Toth v DPP,29 a patient was found guilty of an offence under the Surveillance Devices Act 2007 (NSW) for secretly recording a clinical consultation without the GP’s knowledge or consent. GPs could be at risk of similar criminal sanctions for recording consultations using AI scribes without patient consent, or sending clinical information to a colleague based on a consultation conducted using an AI scribe without patient consent.

The use of AI scribes in relation to telehealth consultations is also complex, as GPs physically located in an Australian jurisdiction should be aware that the privacy and/or surveillance devices laws of another Australian or overseas jurisdiction may also apply where the patient is physically located interstate or overseas. In addition, the use of an AI scribe during a telehealth consultation without patient consent could amount to the interception of a telecommunication in breach of Section 7 of the Telecommunications (Interception and Access) Act 1979 (Cth). To ensure that GPs do not breach these laws, consent must be obtained.

Table 2. Legislation governing listening devicesA
For each jurisdiction, the table lists the relevant legislation, whether use of a listening device without consent from a party to the conversation is prohibited, and whether communication or publication of recordings without such consent is prohibited.
  • ACT: Listening Devices Act 1992 (ACT). Use without consent: prohibited. Communication/publication: prohibited.
  • NSW: Surveillance Devices Act 2007 (NSW). Use without consent: prohibited. Communication/publication: prohibited.
  • NT: Surveillance Devices Act 2007 (NT). Use without consent: not prohibited.B Communication/publication: prohibited.
  • Qld: Invasion of Privacy Act 1971 (Qld). Use without consent: not prohibited.B Communication/publication: prohibited.
  • SA: Surveillance Devices Act 2016 (SA). Use without consent: prohibited. Communication/publication: prohibited.
  • Tas: Listening Devices Act 1991 (Tas). Use without consent: prohibited. Communication/publication: prohibited.
  • Vic: Surveillance Devices Act 1999 (Vic). Use without consent: not prohibited.B Communication/publication: prohibited.
  • WA: Surveillance Devices Act 1998 (WA). Use without consent: prohibited. Communication/publication: prohibited.
A The definition of a listening device is largely consistent across Australian jurisdictions and ‘includes any instrument, apparatus, equipment or device capable of being used to overhear, record, monitor or listen to a conversation or words spoken to or by any person in conversation, but does not include a hearing aid’. See, for example, the Surveillance Devices Act 2007 (NSW), s 4.
B Only prohibited if the other person is not a party to the conversation.
ACT, Australian Capital Territory; Cth, Commonwealth; NSW, New South Wales; NT, Northern Territory; Qld, Queensland; SA, South Australia; Tas, Tasmania; Vic, Victoria; WA, Western Australia.

Patient consent

As described above, GPs must obtain patient consent before using an AI scribe. This not only protects GPs from breaching surveillance devices and privacy laws, but is also an important ethical obligation to ensure transparency and to engender patient trust in the technology and the therapeutic alliance.30 Box 1 sets out some of the key matters that patients must consent to when using an AI scribe. Some software vendors helpfully provide prompts that require clinicians to confirm that they have sought consent before each consultation can be transcribed.31 Some also provide detailed patient information and written consent forms.32 While written consent forms are helpful in demonstrating that consent has been obtained, a verbal discussion should also occur in which patients are offered an opportunity to ask questions.

If consent is obtained, GPs should document the details of the information provided to the patient and the patient’s consent (or refusal of consent) in the medical record. This also serves as a record that an AI scribe was or was not used in the generation of the record. Patients should be given the opportunity to withdraw their consent at any time and re-consenting should occur prior to each consultation using an AI scribe, as the recording of each consultation without consent may be a separate criminal offence in some Australian jurisdictions. While the matters in Box 1 should be canvassed with patients prior to the first use of an AI scribe, the discussion and consent process prior to subsequent consultations is likely to be limited to reminding patients about the use of the AI scribe and providing an opportunity for the patient to ask questions or opt out. Where patients withhold consent to the use of an AI scribe, the consultation should be documented in the usual manner.

To support the consent process, patients should also be made aware of the proposed use of AI scribes ahead of the consultation, such as through the clinic’s website, booking engine, privacy policy or notice, new patient registration forms or information pamphlets. Where patients lack the capacity to consent due to age, illness or disability, consent should be sought from the patient’s substitute decision maker, parent or guardian, in accordance with legal requirements in each Australian state or territory.

Box 1. Informed consent to AI scribe use
To be able to provide informed consent to the use of an AI scribe, patients must understand all the following:
  • That the purpose of using an AI scribe is to assist the GP to document the clinical consultation.
  • That the AI scribe will audio record (or listen to) the consultation and convert spoken word into text.
  • That an audio file and/or text transcript will be retained and processed by a third-party organisation to generate the AI output.
To assist patients to understand what happens to their PII, GPs must understand all the following:
  • What information will be retained by the third party (ie audio file, text transcript, or both).
  • For how long this information will be retained.
  • Whether or not the information retained will be de-identified.
  • Where information will be stored (ie within or outside Australia).
  • What measures are in place to encrypt and securely transfer data.
  • Whether there are any intended secondary uses of the data (eg sharing with an open-source AI model for learning purposes).
Patients should also be given the opportunity to ask questions and should be re-consented prior to each consultation so they can opt out at any time.
PII, personally identifiable information.

Accuracy

Although AI scribes continue to improve, one of their persisting limitations is the risk of false or misleading outputs, known as ‘hallucinations’.33 Hallucination rates across popular AI tools vary from 0.8% to 3.9%.34 Errors may arise when AI scribes mishear and/or incorrectly transcribe spoken words, and when they fail to detect the clinically relevant aspects of a history. These errors can compromise patient safety (when other practitioners caring for a patient rely on inaccurate health information) or lead to data breaches (when PII is disclosed to the wrong recipient because patient or practitioner contact details are incorrect).35 Over-reliance on tools that ordinarily perform well is itself a danger, especially when GPs are under significant time pressure.

AI scribes generate outputs based on what is heard and transcribed during a consultation. Hence, outputs may also be inaccurate or incomplete when parts of the consultation are unspoken or rely on information from extraneous sources (such as discharge summaries or test results). GPs need to consider this limitation by either verbalising additional aspects of the consultation (such as physical examination findings and test results) or by manually adding further clinical information to the output before it is finalised. Although verbalisation of the consultation may assist patients to understand GPs’ clinical reasoning, it substantially changes the manner in which consultations are conducted and may not always be appropriate (such as verbalising the patient’s general appearance, weight, behaviours or sensitive issues, such as mental health or suspected child abuse). GPs may need to adjust their consulting style to the circumstances of the consultation and exercise discretion.

Ultimately, it is crucial that GPs remain vigilant and check the accuracy of all AI-generated outputs after every recorded consultation. If the output generated does not accurately reflect the content of the consultation, the GP must edit, amend or correct the output before it is finalised. This is consistent with GPs’ legal and professional obligations under various Australian privacy laws,36 and the Medical Board of Australia’s Code of Conduct (see Box 2).37 Although there are no published cases involving medical practitioners, there have been two recent and widely reported cases involving lawyers in Vic38 and NSW39 who used AI to generate court documents that contained serious errors, including fictitious cases. Both lawyers were criticised by the courts for failing to check the accuracy of those documents, and were referred to the relevant state legal complaints commissions. This has prompted legal regulators to send a clear warning to the legal profession about the importance of checking all AI-generated outputs for accuracy.40


Box 2. Medical Board of Australia – Good medical practice: A code of conduct for doctors in Australia37
10.5 Medical records
Maintaining clear and accurate medical records is essential for the continuing good care of patients. Good medical practice involves:
  • keeping accurate, up to date and legible records that report relevant details of clinical history, clinical findings, investigations, diagnosis, information given to patients, medication, referral and other management in a form that can be understood by other health practitioners
  • ensuring that your medical records are held securely and are protected against unauthorised access
  • ensuring that your medical records show respect for your patients and do not include demeaning or derogatory remarks
  • ensuring that the records are sufficient to facilitate continuity of patient care
  • making records at the time of the events, or as soon as possible afterwards
  • dating any changes and additions to medical records, including when the record is electronic
  • recognising patients’ right to access information contained in their medical records and facilitating that access
  • promptly facilitating the transfer of health information when requested by the patient or third party with requisite authority
  • retaining records for the period required by law and ensuring they are destroyed securely when they are no longer required.

Professional guidance

In response to the rapid integration of AI into healthcare, regulators, professional bodies and courts are creating guidance for clinicians.

The Australian Health Practitioner Regulation Agency (Ahpra) has created a guideline that explains how the Code of Conduct applies to practitioners using AI scribes.41 It states that practitioners must: understand and explain to patients the privacy implications of using AI scribes; obtain informed consent before using AI; ensure outputs are culturally safe; and always apply human judgment to AI outputs by checking for accuracy and relevance. Ahpra also warns practitioners against entering confidential patient information into open-source AI tools (such as ChatGPT) where health information can be stored overseas indefinitely for later public use.

Similarly, the Royal Australian College of General Practitioners has released a fact sheet for GPs.42 It advocates a cautious approach, given the limited data on the safety, utility, validity, efficacy or applicability of AI scribes in an Australian GP context. It recommends that GP clinics develop their own policies and procedures to decide if, when and how AI scribes will be deployed. This includes the need for training and additional IT security to prevent data breaches. It also recommends that, before purchasing a product, GPs talk with software vendors to ensure they understand how it works and any potential impact that altered clinical workflows may have on daily practice.

Finally, recent directions from the Chief Justice of the NSW Supreme Court prevent expert witnesses from using AI to draft or prepare an expert report (in whole or in part) without prior court permission.43 If the court grants leave, the report must state which parts of the report were generated using AI, how the AI scribe was used and what codes (if any) regulate the expert’s use of the AI tool. Lawyers commissioning expert opinions have an obligation to bring these directions to the expert’s attention. In Victoria, witnesses are encouraged to disclose to courts if AI is used in the preparation of a court document.44

Conclusion

The rise of AI scribes over the last 18 months has been unprecedented. By expediting the creation of medical records, AI scribes can reduce the administrative burden for busy GPs, potentially enhancing patient safety and GP wellbeing and productivity. However, to harness these powerful new tools, GPs need to understand the medicolegal issues that may arise before adopting them into their practice. For those who are already using these tools, this paper serves to highlight the importance of GPs regularly reviewing the performance of these tools and their compliance with Australian laws and regulations.

Before implementing an AI scribe into clinical practice, GPs need to ensure that they are prepared: new technologies may impact workflows; potential products should be trialled; user agreements should be carefully reviewed; staff require training; and privacy policies need updating. GPs also need to understand how patient information will be handled by the AI scribe (whether, what, where and for how long data are stored) so that they can explain this to patients, who must consent. Additional components of consultations may need to be verbalised, and all AI-generated outputs must be checked for accuracy and appropriateness.

Even after AI scribes have been embedded into practice, their use and performance must be regularly reviewed to ensure ongoing suitability and legal compliance. As experience with AI scribes increases, our understanding of the benefits and risks will deepen. Simultaneously, there will be continued legal and regulatory developments, as more cases, guidelines and laws seek to further regulate this burgeoning area of healthcare practice. GPs need to keep abreast of these developments.

Key points

  • AI scribes can save time, increase productivity and reduce GP stress.
  • Before using an AI scribe, GPs should test the product, ensure that it is designed for a healthcare context and check that information is not shared publicly.
  • Patient consent must be obtained before each consultation using an AI scribe.
  • GPs must understand what happens to patient information after collection so this can be effectively explained to patients.
  • Every AI output must be checked for accuracy before being finalised.
Disclaimer

The information contained in this article is intended to be general information only and should not be relied upon as legal advice. You should seek legal or other professional advice before relying on any content and exercise proper clinical decision-making with regard to your individual circumstances. Information is current at the date of acceptance of this article.

Competing interests: None
Provenance and peer review: Commissioned, externally peer reviewed
AI declaration: The authors confirm that there was no use of artificial intelligence (AI)-assisted technology for assisting in the writing or editing of the manuscript and no images were manipulated using AI.
Funding: None.
Correspondence to:
obradfield@mips.com.au
References
  1. Abdelrahman W, Abdelmageed A. Medical record keeping: Clarity, accuracy, and timeliness are essential. BMJ 2014;348:f7716. doi: 10.1136/bmj.f7716.
  2. Kite v Malycha [1998] 71 SASR 321.
  3. Tai v Hatzistavrou [1999] NSWCA 306.
  4. Ali SR, Dobbs TD, Hutchings HA, Whitaker IS. Using ChatGPT to write patient clinic letters. Lancet Digit Health 2023;5(4):e179–81. doi: 10.1016/S2589-7500(23)00048-1.
  5. Siegler JE, Patel NN, Dine CJ. Prioritizing paperwork over patient care: Why can’t we do both? J Grad Med Educ 2015;7(1):16–18. doi: 10.4300/JGME-D-14-00494.1.
  6. Hawkins M. 2018 Survey of America’s physicians: Practice patterns & perspectives. The Physician’s Foundation, 2018. Available at http://physiciansfoundation.org/wp-content/uploads/physicians-survey-results-final-2018.pdf [Accessed 10 March 2025].
  7. Knibbs J. First generative AI app to ease GP consult admin. The Medical Republic, 2023. Available at www.medicalrepublic.com.au/first-generative-ai-app-to-ease-gp-consult-admin/17610 [Accessed 10 March 2025].
  8. Woodrow L, John W. Lyrebird AI a ‘gamechanger’ for GPs. The Medical Republic, 2024. Available at www.medicalrepublic.com.au/lyrebird-ai-a-gamechanger-for-gps/104917 [Accessed 10 March 2025].
  9. Payne H. Docs using AI spent just as much time with patients. The Medical Republic, 2024. Available at www.medicalrepublic.com.au/docs-using-ai-spent-just-as-much-time-with-patients [Accessed 10 March 2025].
  10. Barak-Corren Y, Wolf R, Rozenblum R, et al. Harnessing the power of generative AI for clinical summaries: Perspectives from emergency physicians. Ann Emerg Med 2024;84(2):128–38. doi: 10.1016/j.annemergmed.2024.01.039.
  11. Knibbs J. GP AI scribe use more than doubles in four months. The Medical Republic, 2024. Available at www.medicalrepublic.com.au/gp-ai-scribe-use-more-than-doubles-in-four-months [Accessed 10 March 2025].
  12. Basu M. AI producing medical notes for more than 10,000 consults a day, says Lyrebird boss. Australian Doctor, 2024. Available at www.ausdoc.com.au/news/ai-producing-medical-notes-for-more-than-10000-consults-a-day-says-lyrebird-boss [Accessed 10 March 2025].
  13. Woodrow L. AI scribe uptake could rival the trusty stethoscope. The Medical Republic, 2025. Available at www.medicalrepublic.com.au/ai-scribe-uptake-could-rival-the-trusty-stethoscope/115311 [Accessed 21 March 2025].
  14. Therapeutic Goods Act 1989 (Cth), s 41BD.
  15. Therapeutic Goods Administration. Artificial Intelligence (AI) and medical device software. TGA, 2024. Available at www.tga.gov.au/how-we-regulate/manufacturing/manufacture-medical-device/manufacture-specific-types-medical-devices/artificial-intelligence-ai-and-medical-device-software#when-ai-is-considered-a-medical-device- [Accessed 10 March 2025].
  16. Therapeutic Goods Administration. Consultation: Clarifying and strengthening the regulation of artificial intelligence (AI). TGA, 2024. Available at www.tga.gov.au/resources/consultation/consultation-clarifying-and-strengthening-regulation-artificial-intelligence-ai [Accessed 10 March 2025].
  17. Privacy Act 1988 (Cth) s 6FA.
  18. My Health Records Act 2012 (Cth) s 5.
  19. Health Records and Information Privacy Act 2002 (NSW) s 6.
  20. Health Records Act 2001 (Vic) s 3.
  21. Health Records (Privacy and Access) Act 1997 (ACT) s 4.
  22. Privacy Act 1988 (Cth) s 6.
  23. Office of the Australian Information Commissioner. What is ‘personal information’? OAIC, 2017. Available at www.oaic.gov.au/privacy/privacy-guidance-for-organisations-and-government-agencies/handling-personal-information/what-is-personal-information [Accessed 10 March 2025].
  24. Privacy Act 1988 (Cth), Sch. 1 cl. 8, as amended on 29 November 2024 by the Privacy and Other Legislation Amendment Act 2024 (Cth), s 36.
  25. Privacy Act 1988 (Cth), Sch. 1 cl. 11.2.
  26. Health Records and Information Privacy Act 2002 (NSW), s 25(1); Health Records Act 2001 (Vic), Sch. 1 cl. 4.2(b); Health Records (Privacy and Access) Act 1997 (ACT), Sch. 1 cl. 4.1(3)(b).
  29. Toth v Director of Public Prosecutions (NSW) [2014] NSWCA 133.
  30. Hatherley JJ. Limits of trust in medical AI. J Med Ethics 2020;46(7):478–81. doi: 10.1136/medethics-2019-105935.
  31. Lyrebird Health. Obtain patient consent. Lyrebird Health, 2024. Available at https://help.lyrebirdhealth.com/en/articles/9229843-obtain-patient-consent [Accessed 10 March 2025].
  32. Lyrebird Health. Written consent form. Available at https://help.lyrebirdhealth.com/en/articles/9229843-obtain-patient-consent [Accessed 10 March 2025].
  33. Templin T, Perez MW, Sylvia S, Leek J, Sinnott-Armstrong N. Addressing 6 challenges in generative AI for digital health: A scoping review. PLOS Digit Health 2024;3(5):e0000503. doi: 10.1371/journal.pdig.0000503.
  34. Vectara. Hallucination Leaderboard. GitHub, 2025. Available at https://github.com/vectara/hallucination-leaderboard [Accessed 10 March 2025].
  35. As per the mandatory data breach notification scheme requirements of Part IIIC of the Privacy Act 1988 (Cth).
  36. Privacy Act 1988 (Cth), Sch. 1 APP 13.1; Health Records and Information Privacy Act 2002 (NSW), Sch. 1 HPP 9; Health Records Act 2001 (Vic), Sch. 1 HPP 3.1; Health Records (Privacy and Access) Act 1997 (ACT), Sch. 1 Principle 3.1(c).
  37. Medical Board of Australia. Good medical practice: A code of conduct for doctors in Australia. Ahpra, 2020. Available at www.medicalboard.gov.au/codes-guidelines-policies/code-of-conduct.aspx [Accessed 10 March 2025].
  38. Taylor J. Melbourne lawyer referred to complaints body after AI generated made-up case citations in family court. The Guardian, 2024. Available at www.theguardian.com/law/2024/oct/10/melbourne-lawyer-referred-to-complaints-body-after-ai-generated-made-up-case-citations-in-family-court-ntwnfb [Accessed 10 March 2025].
  39. Taylor J. Australian lawyer caught using ChatGPT filed court documents referencing non-existent cases. The Guardian, 2025. Available at www.theguardian.com/australia-news/2025/feb/01/australian-lawyer-caught-using-chatgpt-filed-court-documents-referencing-non-existent-cases [Accessed 10 March 2025].
  40. The Law Society of NSW; the Legal Practice Board of Western Australia; Victorian Legal Services Board and Commissioner. Statement on the use of artificial intelligence in Australian legal practice. Victorian Legal Services Board and Commissioner, 2024. Available at www.lsbc.vic.gov.au/news-updates/news/statement-use-artificial-intelligence-australian-legal-practice [Accessed 10 March 2025].
  41. Australian Health Practitioner Regulation Agency. Meeting your professional obligations when using artificial intelligence in healthcare. Ahpra, 2024. Available at www.ahpra.gov.au/Resources/Artificial-Intelligence-in-healthcare.aspx [Accessed 10 March 2025].
  42. The Royal Australian College of General Practitioners. Artificial intelligence (AI) scribes. RACGP, 2025. Available at www.racgp.org.au/running-a-practice/technology/business-technology/artificial-intelligence-ai-scribes [Accessed 10 March 2025].
  43. Supreme Court of NSW. Supreme Court practice note SC gen 23: Use of generative artificial intelligence (gen AI). Supreme Court of NSW, 2025. Available at https://supremecourt.nsw.gov.au/documents/Practice-and-Procedure/Practice-Notes/general/current/PN_Generative_AI_21112024.pdf [Accessed 10 March 2025].
  44. Supreme Court of Victoria. Guidelines for litigants: Responsible use of artificial intelligence in litigation. Supreme Court of Victoria, 2024. Available at www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation [Accessed 10 March 2025].

Keywords: Artificial intelligence; Health systems; Medical records; Medicolegal jurisprudence