Two in Five Australian GPs Now Use AI Scribes to Record Patient Notes
Doctors say the technology frees them to connect with patients, but advocates warn of privacy and accuracy risks
AI scribes work by recording doctor-patient conversations and using large language models to generate clinical notes, summaries, and even referral letters. The technology has gained rapid adoption in Australia, outpacing regulatory frameworks designed to govern its use.
Proponents argue that AI scribes eliminate the burden of note-taking, allowing GPs to maintain eye contact and engage more deeply with patients rather than typing during consultations. Some report saving 30 to 60 minutes per day on administrative work.
However, patient advocacy groups have raised concerns about consent, data storage, and the accuracy of AI-generated medical records. Questions remain about what happens when an AI misinterprets a clinical conversation, who is liable for errors, and whether patients fully understand what they are consenting to when told an AI is listening.
Analysis
Why This Matters
The rapid adoption of AI in clinical settings is outpacing the guardrails. Medical records are among the most sensitive personal data, and errors in AI-generated notes could have serious health consequences.
Background
AI scribe tools such as Nabla and Heidi have proliferated in the Australian healthcare market over the past 18 months. The RACGP has issued guidelines, but there is no mandatory regulatory framework.
Key Perspectives
Doctors see a productivity tool that improves patient interaction. Privacy advocates see an unregulated surveillance technology operating in sensitive settings. Patients, meanwhile, are largely unaware of how the technology works.
What to Watch
Whether the TGA or AHPRA moves to regulate AI scribes specifically, and whether any adverse events linked to AI-generated clinical notes emerge.