According to a survey published in the journal BMJ Health & Care Informatics, one in five GPs is using AI tools such as ChatGPT to assist with tasks like generating documentation after patient appointments. The survey of 1,006 GPs found that almost a third of those using generative AI tools were using them for this purpose, while others were using the tools to suggest differential diagnoses or treatment options.
While the researchers acknowledged that GPs may find value in these tools for administrative tasks and clinical reasoning support, they also raised concerns about potential risks to patient privacy. In particular, they questioned how the internet companies behind these AI tools use the information they gather, and how existing legislation would apply to their use in clinical practice.
Dr. Ellie Mein, a medico-legal adviser at the Medical Defence Union, highlighted potential problems with GPs' use of AI, chiefly inaccuracy and patient confidentiality. She cautioned doctors against relying on AI programs to draft responses to patient complaints, where both issues can arise, and emphasized the importance of using AI ethically and in compliance with relevant guidance and regulations.
Both the researchers and Dr. Mein stressed the need for greater awareness of the benefits and risks of AI in healthcare settings. As the use of AI tools in clinical practice continues to evolve, healthcare professionals must weigh the ethical implications and safeguard patient confidentiality when adopting these technologies.