Can ChatGPT Plus prove its overall effectiveness in health care settings?


Key Takeaways

  • AI integration in health care requires physician training to maximize effectiveness, as shown by a study on ChatGPT Plus.
  • ChatGPT Plus did not significantly improve diagnostic accuracy compared to traditional methods across three hospitals.

A study suggests that if health care professionals are better trained to use AI, the technology's potential to make care more effective could be realized in the near future.

Can ChatGPT be useful for physicians?: ©Rokas - stock.adobe.com

While many in health care have seen breakthroughs from the integration of artificial intelligence, the technology is not always effective when physicians are not fully trained to use it. In a recent study, Andrew S. Parsons and colleagues from University of Virginia Health set out to determine how technology such as ChatGPT Plus could be used more effectively despite this gap in training. The study was published in JAMA Network Open.

His team found that using ChatGPT Plus did not significantly improve the accuracy of doctors' diagnoses compared with traditional resources across three hospitals: UVA Health, Stanford, and Harvard's Beth Israel Deaconess Medical Center.

During the study, Parsons and his colleagues randomly assigned 50 physicians in family medicine, internal medicine, and emergency medicine to diagnose complex cases. Half were assigned to use ChatGPT Plus, while the other 25 relied on conventional resources such as UpToDate and Google. The researchers then compared the two groups' diagnoses and found that accuracy was similar.

However, ChatGPT Plus on its own outperformed both groups, highlighting the technology's potential to keep improving patient care. It also underscores that physicians need more training and experience with the technology to use it skillfully in real-life patient situations.

“Our study shows that AI alone can be an effective and powerful tool for diagnosis,” Parsons said in a statement. “We were surprised to find that adding a human physician to the mix actually reduced diagnostic accuracy though improved efficiency. These results likely mean that we need formal training in how best to use AI.”

Parsons and his team suggest that ChatGPT Plus is better suited for augmentation and that physicians should not be replaced by the technology. They also urge hospitals and health organizations to purchase predefined prompts to implement in clinical workflows and documentation.

“As AI becomes more embedded in health care, it's essential to understand how we can leverage these tools to improve patient care and the physician experience,” Parsons said. “This study suggests there is much work to be done in terms of optimizing our partnership with AI in the clinical environment.”
