Medical Economics Journal
July 2023, Volume 100, Issue 7

AI Special Report: What doctors need to know about ChatGPT and other AI tools


For many physicians, the rise of artificial intelligence (AI) tools such as ChatGPT may feel unnerving, even threatening, particularly as patients begin to use these tools to get answers about their health. Although every tool has its dangers, and AI is in many ways still in the early days of its capabilities, AI tools are here to stay, according to experts. Physicians who educate themselves and learn to use them are likely to stay ahead of the curve and even enhance their practices.

“The key thing to remember is that AI is a very powerful tool, and like a hammer that can be used to hit people, it can also be used to build a house,” says Matt Hollingsworth, CEO of Carta Healthcare, which uses AI to streamline data collection. “The fact that people can run around smacking people upside the head with a hammer is not a reason to fear the hammer per se, but a reason to fear specific uses of the hammer. The same is true of AI.”

Hollingsworth says that as long as users are deeply educated about what they are using, AI is not dangerous. The more important question physicians should be asking, he suggests, is, “Am I getting value out of this?”

What is ChatGPT, and should physicians use it?

Currently, the AI known as ChatGPT, a large language model developed by OpenAI, is all the buzz, since a free version of it recently became available for lay use. In simple terms, Hollingsworth describes ChatGPT as “an extremely fancy autocomplete.” In essence, the AI is a predictive language model that mimics language patterns based on what it has seen.
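
To make the “fancy autocomplete” analogy concrete, here is a toy sketch of the underlying idea (in Python, with invented training text): a model counts which word tends to follow which, then completes a prompt one most-likely word at a time. ChatGPT does this with a neural network over an enormous corpus rather than a word-count table, but the predict-the-next-word principle is the same.

```python
# A toy "autocomplete": count which word follows which in some training
# text, then extend a prompt one most-frequent next word at a time.
from collections import Counter, defaultdict

training_text = (  # invented miniature corpus for illustration
    "the patient reports chest pain . the patient denies fever . "
    "the doctor orders an ecg . the doctor reviews the ecg ."
)

# For every word, tally the words observed immediately after it.
follows = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follows[current_word][next_word] += 1

def autocomplete(prompt: str, length: int = 6) -> str:
    out = prompt.split()
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break  # never saw this word; nothing to predict
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(autocomplete("the patient"))  # -> "the patient reports chest pain . the patient"
```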

In ChatGPT’s case, it has seen almost every piece of written information that exists on the internet, including books, up to 2021, according to Adrian Zidaritz, a data scientist and founder of the Institute for a Stronger Democracy Through Artificial Intelligence. Therefore, it can respond to user queries with answers that sound humanlike, as though it is thinking, but it is not actually applying any sort of logic, nor does it have the ability to fact-check itself.

“What physicians need to be careful with is that ChatGPT tends to make up information at this point,” says Victor Cortes, founder and CEO of Verso, a software-as-a-service (SaaS) startup that helps medical professionals manage appointments, payments and patients. “There have been cases where you ask it for some information and it provides a link to a paper title, but it’s not actually real.” ChatGPT is trained to predict associations between word and sentence patterns, but it has no way to ensure accuracy.
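
This is one reason any reference ChatGPT produces should be checked against a real bibliographic index before it is trusted. Below is a minimal sketch of such a check using the public Crossref REST API (api.crossref.org) and the third-party requests library; the title being looked up is a placeholder, not a real citation.

```python
# Guard against fabricated citations by looking the title up in a
# bibliographic index. Requires the third-party requests package.
import requests

def citation_exists(title: str) -> bool:
    """Return True if Crossref lists a work with a matching title."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": title, "rows": 3},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    # Crossref returns each work's title as a list of strings.
    return any(
        title.lower() == t.lower()
        for item in items
        for t in item.get("title", [])
    )

# A made-up title, standing in for one that ChatGPT invented:
print(citation_exists("A plausible-sounding paper that does not exist"))
```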

ChatGPT by itself also cannot deliver quantitative results because it is not programmed to do math or make non-language-based associations. But it can be paired with other kinds of computational tools, such as Mathematica, a software system used by scientists in academia, and likely will be down the road for more complex uses, Zidaritz says.
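
One pairing mechanism that already exists is “function calling”: the model emits a structured request, and ordinary code performs the actual computation. The sketch below uses OpenAI’s Python SDK (it assumes the openai package and an OPENAI_API_KEY environment variable); the calculate_bmi tool and the model name are illustrative inventions, not something prescribed by OpenAI or anyone quoted here.

```python
# Delegating arithmetic to local code via function calling: the model
# extracts the numbers, and Python does the math.
import json
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

tools = [{
    "type": "function",
    "function": {
        "name": "calculate_bmi",  # hypothetical tool for this example
        "description": "Compute body mass index from weight and height.",
        "parameters": {
            "type": "object",
            "properties": {
                "weight_kg": {"type": "number"},
                "height_m": {"type": "number"},
            },
            "required": ["weight_kg", "height_m"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use any model you have access to
    messages=[{"role": "user",
               "content": "What is the BMI of a patient who is 1.75 m tall "
                          "and weighs 82 kg?"}],
    tools=tools,
    # Force the model to use the tool so the math happens locally.
    tool_choice={"type": "function", "function": {"name": "calculate_bmi"}},
)

call = response.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)
print(round(args["weight_kg"] / args["height_m"] ** 2, 1))  # computed by Python, not the model
```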

However, ChatGPT can be useful to physicians in a couple of ways even now, in its early days, according to Hollingsworth. “ChatGPT is already useful broadly as a writing tool. Physicians have to spend a lot of time typing correspondence with patients or insurance companies.” Physicians could ask ChatGPT to aggregate the basic information and then make a pass over the draft for accuracy and specificity, saving time.
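
As an illustration of that drafting workflow, the sketch below asks the OpenAI API for a first-draft letter. It assumes the openai Python package and an OPENAI_API_KEY environment variable; the model name and prompts are placeholders, and the output is only a starting point for the physician’s own pass for accuracy and specificity.

```python
# Draft routine correspondence with ChatGPT, then edit by hand.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You draft concise, professional letters for a physician. "
                    "Leave bracketed placeholders for clinical specifics."},
        {"role": "user",
         "content": "Draft a letter to an insurer requesting prior "
                    "authorization for an MRI for a patient with chronic "
                    "low back pain that has not responded to physical therapy."},
    ],
)

print(response.choices[0].message.content)  # review and edit before sending
```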

Additionally, it can help translate technical medical jargon for patients, Zidaritz says. “We all complain that doctors give us explanations or write in our medical records in medical speak that’s very technical,” he says. “So we could ask ChatGPT to translate this in a human, more understandable way so (patients) know what the doctor is saying.”

It could also act like a medical assistant or scribe by recording conversations between physician and patient and then summarizing them into a report, Zidaritz says.
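
A minimal sketch of that scribe workflow, under the same assumptions as above (OpenAI Python SDK, placeholder model name): an invented visit transcript goes in, and a structured draft note comes out for the physician to verify before it goes anywhere near the record.

```python
# Summarize a visit transcript into a draft structured note.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

transcript = (  # invented transcript for illustration
    "Doctor: What brings you in today? "
    "Patient: I've had a cough for two weeks and some wheezing at night. "
    "Doctor: Any fever? Patient: No, no fever."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "Summarize the visit transcript as a brief note with "
                    "Subjective, Objective, Assessment and Plan headings. "
                    "Write 'not discussed' where information is missing."},
        {"role": "user", "content": transcript},
    ],
)

print(response.choices[0].message.content)  # a draft for physician review only
```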

And it could be used as a kind of virtual intern, Cortes says. “You know, if you need some inspiration, or additional research, or a brainstorm to think of potential solutions, cases or diagnoses. ChatGPT might be a good alternative for that.”

What’s important to note is that patients will be using it to research health information and even self-diagnose, which is where physicians need to be vigilant, because ChatGPT gets so many things wrong. “Patients are already using it,” Zidaritz says. “And they aren’t trained to distinguish between (correct and incorrect information). So that could create some tension between the patient and the doctor.”

Other AI tools and their uses

AI tools are also useful for anything operations related, Hollingsworth says. “So inventory management, patient scheduling, nurse scheduling, vendor management, contract management, etc.”

He feels that utilizing AI tools properly could even halve administrative overhead. “Everything where there’s a nurse sitting there poking around in PeopleSoft or an inventory management system, there’s an AI tool to make that substantially better,” he says.

There are already some uses of AI in diagnostics, such as detecting cancer in patient tissue samples in laboratories, but this area remains a work in progress, albeit a promising one. “For obvious reasons, these AI diagnostic tools need to go through FDA approval and even clinical trials,” Hollingsworth says.

AI tools will also be able to help with insurance claims, Zidaritz explains. “It’s going to be taking over that whole administrative claim recording, in making claims and adjudicating claims. It will be able to spot mistakes.”

AI will not replace doctors

Although AI tools will keep getting more sophisticated and transform some health care practices, they will not replace doctors for many reasons.

The obvious reason, according to Cortes, is that “the warmth of the doctor-patient relationship can’t be replaced by AI. Health care is so humane that AI is going to be an aid, but it’s not going to be a replacement of what doctors do.”

Furthermore, according to Hollingsworth, “most AI needs a well-trained babysitter who’s an expert in something and can guarantee the quality of what’s coming out of the tool because there’s a high likelihood that it’s going to do something wrong, and that person needs to be trained to prevent it.”

For physicians who are unsure about these tools, Pau Rué, vice president of artificial intelligence at Infermedica, an AI-based medical guidance platform, suggests becoming familiar with them. “My general recommendation would be for doctors to get familiar with AI tools. I think there is an AI revolution and AI is here to stay.” He does warn that many of these tools are not yet very mature or accurate, but says they are going to get better and better.

That said, he cautions that these tools need clinical validation and regulatory approval to build trust among both physicians and patients, because there is potential for legal concern. “Health care providers need to investigate the outputs of these models, particularly in the case of adverse incidents or complaints.”

Hollingsworth recommends that physicians be discerning about the tools they end up using, and that they seek solutions to their problems rather than just the hot new AI tool. “Shop for value, not for AI. It is my firm belief that 99% of the value that is going to be generated by AI is going to be delivered via a services layer of experts who know how to use it.”

However, Cortes does feel that physicians who engage with AI tools will be better prepared. “I think that people that start taking advantage of these tools are going to be way more prepared for the new economy and the new ways things will be done.”
