GPs turn to AI to help with patient workload
This is the fifth feature in a six-part series looking at how AI is changing medical research and treatment.
Difficulty getting an appointment with a GP is a familiar problem in the UK.
Even when an appointment is secured, rising workloads on doctors mean those meetings may be shorter than either the doctor or the patient would like.
But Dr Deepali Mishra-Sharp, a GP partner in Birmingham, has found that AI has removed a huge part of the administration from her work, meaning she can focus more on patients.
Dr Mishra-Sharp started using Heidi Health, a free AI-assisted medical transcription tool that listens to and transcribes patient appointments, about four months ago, and says it has made a big difference.
“Normally when I’m with a patient, I’m writing things down and it takes away from the consultation,” she says. “Now I can spend my whole time face to face with the patient, tuning in and actively listening. It makes for a better quality consultation.”
She says the technology streamlines her workflow, saving her “two to three minutes per consultation, if not more.” She explains other benefits: “It reduces the risk of errors and omissions in my medical note taking.”
With the workforce declining while patient numbers continue to rise, GPs face enormous pressures.
According to the British Medical Association (BMA), a full-time GP is now responsible for 2,273 patients, a 17% increase on September 2015.
Could AI be the solution, helping GPs cut down on administrative tasks and reduce fatigue?
Some research suggests it could be. A 2019 report by Health Education England estimated that new technologies such as AI could save a minimum of one minute per patient, equivalent to 5.7 million hours of GP time.
Meanwhile, research by Oxford University in 2020 found that 44% of all administrative tasks in general practice can now be either mostly or completely automated, freeing up time to spend with patients.
One company working on this is Denmark’s Corti, which has developed AI that can listen to healthcare consultations, either over the phone or in person, and suggest follow-up questions, prompts and treatment options, as well as automating note taking.
Corti says its technology processes around 150,000 patient interactions per day in hospitals, GP surgeries and health care institutions across Europe and the US, totaling around 100 million per year.
“The idea is that the physician can spend more time with the patient,” says Lars Maaløe, Corti’s co-founder and chief technology officer. He says the technology can suggest questions based on previous conversations it has heard in other healthcare situations.
“The AI has access to related conversations and then it can think, OK, in 10,000 similar conversations, most of the questions asked were X, and that one has not been asked here,” says Mr Maaløe.
“I think GPs have back-to-back consultations and so have little time to consult with colleagues. It’s like getting that colleague’s advice.”
He also says it can look at a patient’s historical data. “It could prompt, for example: did you remember to ask if the patient is still suffering from pain in the right knee?”
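Corti has not published the details of how its system works, but the behaviour Mr Maaløe describes, comparing the current consultation with thousands of similar past ones and surfacing the questions that usually get asked but have not come up yet, can be sketched in outline. The code below is a hypothetical illustration of that general idea with made-up data, not Corti’s actual implementation.

```python
# Hypothetical sketch only: suggest follow-up questions by comparing the
# current consultation with questions asked in similar past consultations.
# Illustrates the general idea described above, not Corti's implementation.
from collections import Counter

def suggest_questions(already_asked, similar_consultations, top_n=3):
    """Return the questions most often asked in similar consultations
    that have not yet been asked in the current one."""
    counts = Counter()
    for consultation in similar_consultations:
        counts.update(set(consultation))  # count each question once per consultation
    unasked = Counter({q: c for q, c in counts.items() if q not in already_asked})
    return [question for question, _ in unasked.most_common(top_n)]

# Toy data standing in for "10,000 similar conversations"
history = [
    ["how long have you had the cough?", "any fever?", "do you smoke?"],
    ["how long have you had the cough?", "do you smoke?", "any chest pain?"],
    ["any fever?", "do you smoke?", "any shortness of breath?"],
]
print(suggest_questions({"how long have you had the cough?"}, history, top_n=2))
# -> ['do you smoke?', 'any fever?']
```

In a real system, both the transcript and the pool of “similar conversations” would come from speech recognition and large-scale models rather than a hand-written list; the sketch only shows the counting-and-ranking step.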
But do patients want technology to listen to and record their conversations?
“The data is not leaving the system,” Mr Maaløe says. He adds, however, that it is good practice to inform the patient.
“If the patient objects to it, the doctor cannot record. We see few examples of that, because patients can see the benefit of better documentation.”
Dr Mishra-Sharp says she tells patients she has a listening device to help her take notes. “I haven’t had anyone have a problem with it yet, but if they did, I wouldn’t do it.”
Meanwhile, some 1,400 GP practices across England are currently using C the Signs, a platform that uses AI to analyse patients’ medical records, check for different signs, symptoms and risk factors of cancer, and recommend what action should be taken.
“It can capture symptoms like cough, cold and inflammation, and essentially within a minute it can see if there is any relevant information from their medical history,” says Dr Bea Bakshi, chief executive and co-founder of C the Signs, who is also a GP.
The AI is trained on published medical research papers.
“For example, it may say the patient is at risk of pancreatic cancer and would benefit from a pancreatic scan, and then the doctor will decide whether to refer them to those pathways,” says Dr Bakshi. “It won’t diagnose, but it can provide reassurance.”
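C the Signs has not disclosed its internal model either, but the kind of check Dr Bakshi describes, scanning a patient’s coded symptoms and history for known risk factors and flagging a possible referral pathway for the GP to act on, might look something like the following in outline. The rules, fields and thresholds here are invented for illustration and carry no clinical meaning.

```python
# Hypothetical sketch only: flag possible referral pathways by matching a
# patient's coded symptoms and history against simple risk rules.
# The rules are invented for illustration and are not clinical guidance.
from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    age: int
    symptoms: set = field(default_factory=set)
    history: set = field(default_factory=set)

# Each made-up rule: (suggested pathway, minimum age, indicators that raise the flag)
RULES = [
    ("pancreatic scan referral", 60, {"weight loss", "abdominal pain", "new-onset diabetes"}),
    ("urgent chest X-ray referral", 40, {"persistent cough", "smoker", "weight loss"}),
]

def flag_pathways(record, min_matches=2):
    """Return pathways whose rule matches enough of the patient's coded data."""
    findings = record.symptoms | record.history
    flagged = []
    for pathway, min_age, indicators in RULES:
        if record.age >= min_age and len(findings & indicators) >= min_matches:
            flagged.append(pathway)
    return flagged

patient = PatientRecord(age=67,
                        symptoms={"abdominal pain", "weight loss"},
                        history={"new-onset diabetes"})
print(flag_pathways(patient))  # -> ['pancreatic scan referral']
```

The real platform draws on published medical research rather than hand-written rules, and, as Dr Bakshi notes, the decision to refer stays with the GP.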
She says they have conducted more than 400,000 cancer risk assessments in real-world settings, diagnosing more than 30,000 patients with more than 50 different types of cancer.
A report on AI published by the BMA this year found that “AI should be expected to transform, rather than replace, healthcare jobs by automating routine tasks and improving efficiency”.
In a statement, Dr Katie Bramall-Stainer, chair of the General Practice Committee UK at the BMA, said: “We believe that AI has the potential to completely transform NHS care – but if it is not implemented safely, it could also cause considerable harm. AI is subject to bias and error, can potentially compromise patient privacy, and is still very much a work in progress.
“While AI can be used to enhance and complement what GPs can offer, as another tool in their arsenal, it is not a silver bullet. We cannot wait on the promise of AI tomorrow to deliver the much-needed productivity, sustainability and safety improvements today.”
Alison Dennis, partner and co-head of law firm Taylor Wessing’s international life sciences team, warns that GPs need to tread carefully when using AI.
Ms Dennis says: “The risk of generative AI tools not providing the complete or correct diagnosis or treatment pathway, or even giving the wrong diagnosis or treatment pathway – i.e. producing hallucinations or basing outputs on clinically incorrect training data – is too high.”
“AI tools that have been trained on reliable data sets and then fully validated for clinical use – which will almost certainly be a specific clinical use – are more suitable for clinical practice.”
She says specialist medical products should be regulated and receive some form of official recognition.
“The NHS will also want to ensure that all data input into the tool is kept securely within the NHS system infrastructure, and is not absorbed for further use by the provider of the tool as training data without appropriate GDPR (General Data Protection Regulation) safeguards in place.”
For now, for GPs like Dr Mishra-Sharp, AI has transformed their work. “It has made me enjoy my consultations again, rather than feeling time pressured.”