Leicestershire expert urges caution as half of medical advice from AI chatbots found to be misleading
The warning follows a new study which found that half of chatbots' responses to health questions were inaccurate or misleading
People in the East Midlands are being warned not to rely on AI chatbots for medical advice, after a new study found that half of their responses to health questions were inaccurate or misleading.
Researchers tested five popular chatbots and found that none of them consistently provided safe, reliable guidance.
Experts say these chatbots often “hallucinate” and make up information if they do not know the answer.
Grant Ingrams from Leicester and Leicestershire's local medical committee explains:
“We had a patient who booked a non-existent appointment using AI.

“A patient turned up at the practice first thing in the morning saying, ‘I have got an appointment with Dr X,’ and the staff said, ‘No, you haven’t.’ He pulled out his phone and said, ‘Look,’ and he’d used the AI on the phone.

“You have to be really careful, because the biggest thing about AI is that it hallucinates. If it cannot find the answer, it makes it up.”
The study, published in BMJ Open, found chatbots performed best answering questions on vaccines and cancer but struggled with topics like stem cells, fitness and nutrition.
“It's a tool, and I think over time the tool will become useful. But it will always be there as an add-on to support and help, rather than to replace,” Grant adds.
Researchers say the rise of AI chatbots means public education and regulation are needed—especially as the technology is not licensed to give medical advice or access up-to-date information.