Embarrassing health condition? Talk to an AI chatbot

AI chatbots could be used to encourage patients to seek help for stigmatising health conditions, suggests new research

23rd December 2021
Patients are more willing to discuss embarrassing or stigmatising health conditions with an artificial intelligence (AI) chatbot than with a health care professional, according to research from the University of Westminster and University College London.

Researchers looked at how willing patients were to receive health information from particular sources, depending on the stigma and severity of the condition. They found that for highly stigmatising health conditions, such as sexually transmitted infections, patients preferred to speak to a chatbot rather than a GP. For severe health conditions, such as cancer, they preferred to consult a GP.

Although health care professionals were, on the whole, perceived as the most desirable sources of health information, the research suggested that chatbots could help encourage patients to talk about conditions they don’t feel comfortable discussing with their GP.

AI not acceptable in all cases

The research, published in the SAGE journal Digital Health, was based on an online survey conducted between May and June 2019 with 237 respondents, of whom 65% were aged over 45 and 73% were women. Respondents were presented with scenarios such as “You have been feeling severely depressed, and having suicidal thoughts”, “You have what you think are headlice” and “You have been coughing up blood”, and asked to indicate their most preferred and acceptable consultation source for each.

Chatbots and virtual voice assistants are increasingly used in primary care but, the researchers wrote, there is a lack of evidence for their feasibility and effectiveness.

The researchers suggested that further research could help establish a set of health topics most suitable for chatbot-led interventions. “Primary care services could consider chatbots as a signposting tool aiding health professionals or improving doctor-patient communication for low severity conditions,” they wrote.

Dr Tom Nadarzynski, lead author of the study from the University of Westminster, said: “Many AI developers need to assess whether their AI-based healthcare tools such as symptoms checkers or risk calculators are acceptable interventions. Our research finds that patients value the opinion of health care professionals, therefore implementation of AI in health care may not be suitable in all cases, especially for serious illnesses.”