Patients want clarity about the use of AI in healthcare

BOSTON – Leveraging artificial intelligence to change the way healthcare professionals communicate with patients is not just a matter of accuracy, transparency, fairness and sound data models; it also requires understanding the challenges of personalization.

What patients want to know, and when they want to know it, adds complexity, and the medical AI industry needs to consider both expected and unexpected patient perspectives, panelists said Thursday at the HIMSS AI in Healthcare Forum.

Artificial intelligence has the power to make situational decisions and help doctors have a more human interaction with patients by removing the hassle of data entry, transforming the patient-doctor relationship.

“These incredible tools, which are evolving much more rapidly than the health care system, can, to some extent, allow us to think about how they’re embedded and really positioned to provide incredible opportunities to personalize the conversation and to advise and support people in making decisions that are relevant to them and what matters to them,” said Anne Snowdon, chief research officer at HIMSS, the parent company of Healthcare IT News.

While the usefulness of AI technology is an important part of the trust discussion, transparency, choice, autonomy and mapping decision-making are crucial for patients.

“There’s a movement starting to redefine and rethink care in that light,” said panel moderator Snowdon, who has a doctorate in nursing.

Improved patient communication

Snowdon was joined by Alexandra Wright, patient advocate and director of research at HIMSS, Chesan Sarab, PhD, director of clinical innovation at Cornell Tech’s Health Tech Hub, Mark Polyak, president of analytics at IPSOS, and Dr. Lukas Kowalczyk, physician at Peak Gastroenterology, for a deeper discussion of what patients want from an AI-enabled healthcare experience.

“While the healthcare industry is still figuring out the challenges of artificial intelligence, AI can improve the quality of conversations and build trust,” said Sarab, who is also on the board of The Right Collective, a nonprofit organization that aims to advance the collective rights, benefits, and voice of patient communities in the health tech space.

Sarab said that in his work with the organization, a patient insight panel described a patient who believed she was communicating through the patient portal with a very helpful clinic nurse named Jessica. The patient lost trust when she asked to meet the nurse in person at the clinic.

“She just said she wished they’d told her up front that it was a chatbot,” he said.

“Patients shouldn’t be deceived,” said Kowalczyk, a gastroenterologist and consultant to Denver-based digital health platform CliXa.

But if patients know that healthcare chatbots like Jessica aren’t real people, AI’s compassionate communication capabilities could improve a difficult situation for them.

“Compassion fatigue is a real thing in medicine,” Kowalczyk said. “Especially as you’re going through your day, it can be really hard. You have one or two patients and it makes it really hard to step into the next one.”

Large language models are good at transforming and translating information and conveying patient concerns to clinicians, giving clinicians time to “take a breather” and regain their empathy, he said.

“Patients feel like the AI is acting as their advocate, and it’s helping us understand them better as people.”

The Dynamics of Personalization

AI may not align with a patient’s vision of care, and in scenarios such as predictive analytics, it may surface information the patient does not want.

“Some patients will want more information, some will want less, or some will want less information during the in-person appointment but more documentation to review later,” Sarab said.

From a physician’s perspective, “it’s difficult to customize all the information, context and content for every patient.”

According to Polyak, there are three components to care: access to care, access to accurate information and the speed of information.

He noted that among the group of patients using ChatGPT, 16% were asking medical questions to reduce medical costs.

“[They] asked ChatGPT to generate different scenarios for how a doctor might approach treatment given the patient’s situation, in order to reduce costs.”

“It wasn’t really what I was expecting, but it was basically scenario generation that they printed out and brought to their appointment.”

Sense of control also varies from patient to patient.

For patients and their families facing a health crisis, “information really is power,” Wright said.

“When you’re in a situation like this, it often feels like you’re out of control,” she said.

“And when you don’t really understand your condition or what’s going on, it can make you feel like you have no control over what’s happening to you.”

When doctors are no longer in the room and patients have questions, they will look for information in search engines or on ChatGPT, she said.

Context also influences the information that patients want to control.

“When I first went to the hospital, would I have liked them to tell me what my chances of survival were? Probably not, because I don’t think it would have made things any better,” Wright said.

“But now that I think about it, if someone told me about my future risk of, say, getting cancer, would I want to know if there was anything I could do to prevent it? Probably.”

Snowdon said the implications of these detailed discussions are a game changer for the use of AI in healthcare: “It’s about empowering people to make their own decisions, providing information with confidence and [discovering] what is most meaningful to them.”

Andrea Fox is a senior editor at Healthcare IT News.
Email: afox@himss.org

Healthcare IT News is a publication of HIMSS Media.

