
WHO’s AI-based chatbot falls short of giving correct medical advice

HQ Team

April 18, 2024: The World Health Organization’s foray into AI has run into trouble. The organization has launched SARAH, short for Smart AI Resource Assistant for Health, which appears to draw on a limited pool of resource data.

SARAH is a digital health promoter, available 24/7 in eight languages via video or text. It is part of WHO’s attempt to use AI to educate the public and help fill the shortage of primary health care workers. The bot offers tips on managing stress, road safety, quitting smoking, and staying healthy.

However, the AI avatar falls short on many fronts. The early prototype, launched on April 2, lags well behind similar generative AI avatars from WebMD and Google.

It tends to give bizarre responses, known in AI terms as ‘hallucinations’, to simple health queries. For most queries, it either refers users to the WHO website or advises them to “consult with your health-care provider.”

Limitations

SARAH lags behind WebMD and Google and is programmed to give out only basic information. In fact, the WHO clearly states on its website that “the answers may not always be accurate because they are based on patterns and probabilities in the available data. WHO takes no responsibility for any conversation content created by Generative AI. Furthermore, the conversation content created by Generative AI in no way represents or comprises the views or beliefs of WHO, and WHO does not warrant or guarantee the accuracy of any conversation content.”

The WHO says SARAH is meant to work in partnership with researchers and governments to provide accurate public health information. The agency is asking them for advice on how to improve the bot and use it in emergency health services. “These technologies are not at the point where they are substitutes for interacting with a professional or getting medical advice from an actual trained physician or health provider,” said Alain Labrique, the director of digital health and innovation at WHO.

Moreover, SARAH is built on OpenAI’s ChatGPT 3.5, whose training data extends only to 2021, so it is not equipped to answer queries about the latest research and data.

Many drugs that the US FDA has already approved are shown as still being in the approval stage. A case in point is the Alzheimer’s drug lecanemab, which SARAH says is still awaiting FDA approval, even though it was cleared for treatment in January 2023. The bot also sometimes glitches on WHO’s own latest data about world events.

Medical chatbots’ efficacy

Medical AI chatbots have raised concerns among medical professionals. A study of ChatGPT’s ability to answer patient questions about medication found the chatbot to be dangerously inept: of the 45 questions fed to ChatGPT, it answered only about a quarter correctly.

SARAH

SARAH appears as a young white female, but Labrique says the avatar may change and evolve to suit user preferences. SARAH herself denies any gender identification and identifies as a digital health promoter. SARAH needs access to your microphone and camera for 30 seconds. All conversation data collected is anonymized and complies with current privacy practices and regulations.

Also, its data access is restricted and not kept up to date, so researchers cannot rely on it to build prediction models.

SARAH was created with technology developed by Soul Machines Limited, a San Francisco- and New Zealand-based Digital People company, with support from the South Africa-based creative company Rooftop.

