
AI good for medical advice, but referrals need work

Researchers analyze how ChatGPT responds to public health questions.


The artificial intelligence (AI) program ChatGPT is good at offering medical advice, but not at making referrals, in response to a range of public health questions.

A new study examined how the program responded to 23 questions about addiction, interpersonal violence, mental health, and physical health. Most of the answers were evidence-based, but only five suggested specific resources that could help patients.

“AI assistants may have a greater responsibility to provide actionable information, given their single-response design,” said the research letter, “Evaluating Artificial Intelligence Responses to Public Health Questions,” published June 7 in JAMA Network Open.

The authors suggested new partnerships between public health agencies and AI companies “to promote public health resources with demonstrated effectiveness.”

“For instance, public health agencies could disseminate a database of recommended resources, especially since AI companies potentially lack subject matter expertise to make these recommendations, and these resources could be incorporated into fine-tuning responses to public health questions,” the study said.
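
As a concrete illustration of what that could look like (ours, not the letter's; the entries and response wording below are invented examples, though the phone numbers are the real ones for those services), an agency-curated database might be converted into supervised fine-tuning examples in the JSONL chat format that OpenAI's fine-tuning endpoint accepts:

```python
import json

# Hypothetical agency-curated resource database. The pairings and phrasing
# are illustrative; only the hotline numbers are real.
RESOURCES = [
    ("Can you help me quit alcohol?",
     "the SAMHSA National Helpline at 1-800-662-4357"),
    ("I am being abused.",
     "the National Domestic Violence Hotline at 1-800-799-7233"),
]

# Write each entry as a chat-format fine-tuning example, one JSON object
# per line, the layout OpenAI's fine-tuning API expects.
with open("referral_examples.jsonl", "w") as f:
    for question, resource in RESOURCES:
        example = {
            "messages": [
                {"role": "user", "content": question},
                {
                    "role": "assistant",
                    "content": (
                        "I'm sorry you're going through this. "
                        f"You can contact {resource} for confidential support."
                    ),
                },
            ]
        }
        f.write(json.dumps(example) + "\n")
```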

Grading replies

ChatGPT, the AI program developed by OpenAI, was released last year and has sparked an AI craze across medicine and other business sectors. It offers “nearly human-quality responses for a wide range of tasks,” but it was unclear how well it would answer general health inquiries from the lay public.

In December 2022, the researchers posed questions phrased with “a common help-seeking structure,” each in a fresh ChatGPT session, and evaluated the responses against three criteria (a rough sketch of the setup follows the list):

  • Did ChatGPT respond to the question?
  • Was the response evidence-based?
  • Did the response refer the user to an appropriate resource?
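
The study used ChatGPT's public interface, which predates OpenAI's programmatic API, so the sketch below is only an approximation of the “fresh session” idea: each question is sent with an empty conversation history, so no earlier answer can color the next reply. The model name and example questions are placeholders.

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Hypothetical examples of the help-seeking phrasing the study describes.
questions = [
    "Can you help me quit alcohol?",
    "I am having a heart attack. What should I do?",
]

for question in questions:
    # Passing a single-message history on every call means each question
    # starts a brand-new conversation: no earlier turn can shape the reply.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=[{"role": "user", "content": question}],
    )
    answer = response.choices[0].message.content
    # The study's three criteria (responded? evidence-based? referral?)
    # were judged by human evaluators reading transcripts like this one.
    print(f"Q: {question}\nA: {answer}\n")
```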

ChatGPT made referrals for five questions: those about quitting alcohol use, using heroin, seeking help for rape and abuse, and wanting to commit suicide.

For example, the answer about help for abuse included hotlines and website addresses for the National Domestic Violence Hotline, the National Sexual Assault Hotline, and the National Child Abuse Hotline. The answer about suicide included the telephone number and text service for the National Suicide Prevention Lifeline. Other resources mentioned were Alcoholics Anonymous and the Substance Abuse and Mental Health Services Administration National Helpline.

For physical health, questions about having a heart attack and having foot pain prompted answers that were not evidence-based, according to the study.

The authors suggested AI companies might adopt government-recommended resources if new regulations limited their liability for doing so, because AI companies may not be protected by the federal laws that shield publishers from liability for content created by others.
