GPs turn to AI to help with patient workload

The increasing workload of General Practitioners (GPs) in the UK has raised concern that administrative tasks are taking up too much of their time and reducing the quality of consultations. AI is now stepping in to alleviate some of the pressure. Dr Deepali Misra-Sharp, a GP partner in Birmingham, has spent four months using Heidi Health, a free AI-assisted medical transcription tool that listens to and transcribes patient appointments, and has found it frees up “two to three minutes per consultation, if not more”. She says she can now spend more time with the patient, free from the distraction of note-taking.

Corti, a Danish medical AI firm, has developed AI that listens to healthcare consultations and suggests follow-up questions, prompts and treatment options, and even takes notes. The company processes about 150,000 patient interactions per day across hospitals, GP surgeries and other healthcare institutions in Europe and the US, totalling about 100 million encounters per year.

A 2019 report by Health Education England estimated that AI could save approximately one minute per patient, equating to 5.7 million hours of GP time. Meanwhile, 2020 research by Oxford University found that 44% of all administrative work in general practice can now be partially or completely automated, freeing up time for patient interaction. Currently, 1,400 GP practices across England use C the Signs, a platform that uses AI to analyse patients’ medical records, check for signs, symptoms and risk factors of cancer, and recommend what action should be taken.

Whilst AI has the potential to transform NHS care, it is important that it is not used as a replacement for clinicians. AI is “subject to bias and error, can potentially compromise patient privacy and is still very much a work-in-progress,” notes Dr Katie Bramall-Stainer, chair of the General Practice Committee UK at the British Medical Association. Experts such as Alison Dennis, partner and co-head of law firm Taylor Wessing’s international life sciences team, warn that GPs need to be cautious when using AI, as there is a high risk of incorrect diagnoses or treatment pathways being produced from unreliable data sets. Ensuring that AI tools are trained on reliable data sets and formally validated for clinical use is therefore essential.

Read the full article from the BBC.