BOSTON – Changing how healthcare providers communicate with patients using artificial intelligence isn’t just about accuracy, transparency, fairness and data model maintenance; it’s also about figuring out how to meet personalization challenges.
What patients want to know, and when, adds another layer of complexity – one that challenges the healthcare AI industry to consider both expected and unexpected patient points of view, according to panelists Thursday at the HIMSS AI in Healthcare Forum.
With the power to contextualize information and free clinicians from data entry so they can be more human with patients, artificial intelligence can transform patient-physician interactions.
“To some degree, these impressive tools that are evolving far more rapidly than a health system can even contemplate how they get embedded, really position or provide a tremendous opportunity to personalize that dialog, what matters to that person, and advise and support them to make the decisions that are relevant to them,” said Anne Snowdon, chief research officer at HIMSS, the parent company of Healthcare IT News.
While the utility of AI technologies is an important part of the conversation about trust, transparency, choice, autonomy and decision-making are also critically important to patients.
“From that perspective, it’s starting to redefine and rethink care,” said Snowdon, the panel’s moderator, who holds a doctorate in nursing.
Improving patient communications
Snowdon was joined by Alexandra Wright, patient advocate and director of research at HIMSS; Dr. Chethan Sarabu, director of clinical innovation at the Health Tech Hub of Cornell Tech; Mark Polyak, president of analytics at IPSOS; and Dr. Lukasz Kowalczyk, a physician at Peak Gastroenterology Associates, for a deeper conversation about what patients want from an AI-enabled healthcare experience.
“While healthcare is still figuring out the challenges of artificial intelligence hallucinations, AI can elevate conversations and build more trust,” said Sarabu, also a board member of the Light Collective, a nonprofit organization that seeks to advance the collective rights, interests and voices of patient communities in health tech.
In his work with the collective, Sarabu said he heard during a patient-insight panel about one patient who thought she was communicating over the patient portal with a very helpful nurse named Jessica at her clinic. The patient said she lost trust when she asked for the nurse in person at her doctor’s office.
“She just said she wished they had told her that it was a chatbot beforehand,” he said.
“You shouldn’t catfish your patients,” quipped Kowalczyk, a practicing GI specialist and an advisor at Denver-based Cliexa, a digital health platform.
But when patients know that healthcare chatbots like Jessica are not real people, an AI’s ability to communicate compassionately can make difficult circumstances better.
“Compassion fatigue is a real thing in healthcare,” Kowalczyk said. “It gets very hard sometimes, especially when you’re going through a day, and it takes one or two patients to really make it hard to step up for that next one.”
Large language models excel at transforming and translating information and describing patient concerns to clinicians, giving them “a moment to take a breath” and regain empathy, he said.
“I think that those are the opportunities where patients feel that the AI is acting as an advocate for them, and it’s helping me understand who they are as a person better.”
The dynamics of personalization
AI does not always add to the patient’s vision of care. In some scenarios – predictive analytics, for example – it may offer information that patients do not want.
“Maybe some patients want more information, some may want less, or someone may want less information in an in-person visit, but they want more material to review afterward,” Sarabu said.
From a doctor’s perspective, “it’s hard to really personalize all the information and context and content for every single patient,” he added.
According to Polyak, there are three elements of care – access to care, access to accurate information and speed of information.
He noted that, among a panel of patients using ChatGPT, 16% were asking healthcare questions in an effort to reduce their costs.
“[They] asked ChatGPT to provide them with different scenarios of how our physicians should approach their care based on what they had – in order to lower the cost.”
“That was not something that I really expected, but it was basically scenario generation that they would print out and bring” to appointments.
A sense of control also varies among patients.
For patients and their families facing a health crisis, “information is really power,” said Wright.
“Often, when you’re in these situations, it can feel very out of control,” she said.
“And if you don’t quite understand your condition or don’t quite understand what’s going on, it can really feel like you don’t have control over what’s happening to you.”
When the doctor is no longer in the room, and patients have questions, they are going to turn to search engines and ChatGPT for information, she said.
Context also plays a role in what information patients want to control.
“When I was first there at the hospital, would I have wanted them to tell me my chance of survival? Probably not, because I don’t think it would have helped the situation,” Wright said.
“But then thinking now, if somebody was going to tell me about my risk of, let’s say, a future cancer, would I want to know if there’s something I could do to prevent that? Probably.”
What the granular discussion suggested, said Snowdon, turns healthcare’s use of AI on its head: “How do we help people help themselves make those decisions, inform with a sense of confidence and [discover] what’s most meaningful to them?”
Andrea Fox is senior editor of Healthcare IT News.
Email: afox@himss.org
Healthcare IT News is a HIMSS Media publication.