AI support can lead to more empathetic patient-provider relationships

Key takeaways:

  • Practicing empathy can benefit patient-provider relationships while also improving health outcomes.
  • AI can be used to support clinical decision-making and give providers more time to spend with patients.

NEW ORLEANS — AI can be used not only to improve clinical decision-making, but also to free up providers to spend more quality time with patients, according to two keynote speakers at the AACE annual meeting.

AI could transform the way providers practice medicine, according to presenters Helen Riess, MD, a part-time associate professor of psychiatry at Harvard Medical School and chief medical officer of Empathetics Inc.; and Dereck Paul, MD, co-founder and CEO of Glass Health.

[Infographic: Potential benefits of AI clinical decision making. Content derived from Riess H and Paul D. Keynote: The art & science of empathy and artificial intelligence accountability. Presented at: American Association of Clinical Endocrinology Annual Scientific and Clinical Conference; May 9-11, 2024; New Orleans.]

Large language models can assess a patient based on the information provided to them and can assist providers by listing possible solutions to a patient’s condition, composing a treatment plan and detailing how a provider can follow up with that patient, according to Paul.
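
For illustration only, the sketch below shows how such a workflow might be wired to a general-purpose large language model. It assumes the OpenAI Python SDK; the model name, prompt wording and case summary are hypothetical stand-ins rather than the system Paul described, and any output would be a draft for clinician review.

```python
# Minimal sketch: prompting a general-purpose LLM for a draft assessment, plan and
# follow-up. Assumes the OpenAI Python SDK; model name, prompt and case text are
# illustrative only, not the presenters' actual system.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

case_summary = (
    "58-year-old with type 2 diabetes, A1c 9.2%, BMI 34, on metformin 1000 mg BID, "
    "reports fatigue and polyuria."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "You are a clinical decision support assistant. Return a differential "
                "diagnosis, a draft treatment plan and suggested follow-up. "
                "Output is a draft for clinician review only."
            ),
        },
        {"role": "user", "content": case_summary},
    ],
)

print(response.choices[0].message.content)  # the clinician reviews and edits before use
```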

“This is not something that’s coming,” Paul said during the presentation. “This is something that is here.”

AI tools can be leveraged to reduce the amount of time a provider spends analyzing electronic medical records or other data. That time can then be used to connect with patients in an empathetic way, according to Riess.

“Many of the problems that we have today with health care stem from the idea that we have to automate health care,” Riess said. “We have to make it digital, quick and easy to use so we can see more and more patients. But what got a bit left in the dust was the importance of the [patient-provider] relationship, which actually contains so much of the healing medicine.”

Importance of empathy

There are many misconceptions about empathy in health care, Riess said. Among them is the belief that empathy does not lead to improved health outcomes. However, a systematic review and meta-analysis published in PLoS One in 2014 found that empathetic interventions led to improvements for people with obesity, osteoarthritis, lower respiratory infection and asthma.

Another misconception is that empathy cannot be taught. Riess and colleagues conducted a trial, published in 2012, in which physicians were randomly assigned to an empathy training group or a control group that received no training. In the study, 57% of physicians in the training group had an improvement in patient-reported ratings vs. 32% of the control group (P = .04).

Riess cited workload, computer use, documentation and new technology as factors leading to detachment and burnout. She said providers need to shift the focus of their work away from data and back to the care of the patient.

“When we re-orient ourselves to the care of the patient, we get more meaning and satisfaction in our work,” Riess said.

Practicing self-empathy is a critical first step to empathetic care, according to Riess. She encouraged providers to invest in their mental and physical well-being, find purpose and meaning in life, become emotionally attuned to themselves and develop strong cohesion with their community.

Empathetic care also needs to be promoted at the institutional level. Riess said institutions need to make their workplaces safe for providers, communicate with them in an accurate and caring fashion, build community cohesion and make mental health a priority.

Riess discussed the Stanford Model for Professional Fulfillment, which includes the three key tenets of personal resilience, culture of wellness and efficiency of practice. Commitment, listening and training from leadership can build that culture of wellness at institutions, according to Riess, and self-empathy, empathy training and education about boundaries can build personal and institutional resilience. For efficiency of practice, Riess said, AI has the ability to play an important role.

“We need training, and we need community building,” Riess said. “We need to come together again as empowered teams that have the power to change culture. And we need to put love back in medicine.”

AI as a support tool

AI will not be a replacement for providers, but will instead be a tool they can use to improve health care in several ways, according to Paul. He discussed data published in 2023 showing that providers who used large language models to make diagnostic decisions performed about as well as the large language models alone.

Paul emphasized the importance of providers staying involved as large language models continue to evolve. Research suggests large language models may propagate racial-ethnic bias in medicine. Additionally, the models must use information and guidance deemed best practice by top experts and organizations in the medical field.

“We’ve seen some of the baseline power of generative AI,” Paul said. “Now, as a health care community, we have to take responsibility to make these models work for us so they can augment our abilities.”

The medical community can improve AI as a clinical decision support tool in several ways: supervising it as it would a medical student or medical assistant, fine-tuning the models on curated datasets to improve performance, creating peer-reviewed and edited content libraries for the models to use, and testing the AI with simulated scenarios to identify strengths and weaknesses.
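
As a rough illustration of the last point, scenario testing can start as a small harness that runs vignettes with expert-agreed answers through a candidate model and flags misses for clinician review. The sketch below is hypothetical: the vignettes, the run_scenarios helper and the crude string-match scoring stand in for the expert grading a real evaluation would require.

```python
# Hypothetical scenario-testing harness: run clinical vignettes with known "expected"
# diagnoses through a model under test and collect the misses for clinician review.
from typing import Callable


def run_scenarios(query_model: Callable[[str], str], test_cases: list[dict]) -> list[dict]:
    """Return the cases where the expected diagnosis never appears in the model output."""
    misses = []
    for case in test_cases:
        answer = query_model(case["vignette"])
        if case["expected"].lower() not in answer.lower():  # crude stand-in for expert grading
            misses.append({**case, "model_answer": answer})
    return misses


test_cases = [
    {"vignette": "32-year-old with palpitations, weight loss, tremor and low TSH.",
     "expected": "Graves disease"},
    {"vignette": "45-year-old with central obesity, striae, hypertension and elevated cortisol.",
     "expected": "Cushing syndrome"},
]


def dummy_model(vignette: str) -> str:
    # Stand-in for the LLM under test; a real harness would call the candidate model here.
    return "Differential: Graves disease, subacute thyroiditis."


missed = run_scenarios(dummy_model, test_cases)
print(f"{len(missed)} of {len(test_cases)} scenarios missed the expected diagnosis")
```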

“It’s more important than any other industry that you have health care providers involved in testing and optimizing these models,” Paul said.

Large language models are already able to act as a clinical support tool for providers. Paul said models can take details from an individual case and determine the most likely diagnoses and differential diagnoses. Models can also assess a patient’s symptoms and develop treatment plans, along with ways for the provider to monitor and follow up with the patient, and they can suggest preventive measures and education that a clinician can then communicate to the patient.

Paul said AI clinical decision support could grow further in the next couple of years. AI may be able to listen in on a patient-provider conversation and draft a transcript and history of present illness based on it. It could also create and revise its diagnoses before and after an encounter with a patient. Multimodal AI may be able to assess medical imaging or improve imaging using its own generated images and videos. AI may also be used to improve electronic medical record review or generate a discharge summary for a patient based strictly on their medical record.
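
One possible shape of that ambient-documentation workflow is sketched below, assuming the OpenAI Python SDK, a pre-recorded audio file and illustrative model names; a real deployment would require patient consent, security review and clinician sign-off, and the draft would always be edited and signed by the clinician.

```python
# Hypothetical ambient-documentation pipeline: transcribe a recorded visit, then ask an
# LLM to draft a history of present illness. Model names, file name and prompt are
# illustrative assumptions, not a description of any specific product.
from openai import OpenAI

client = OpenAI()

with open("visit_recording.wav", "rb") as audio:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=audio)

draft = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "Draft a concise history of present illness from this visit "
                    "transcript. Flag any gaps for the clinician to confirm."},
        {"role": "user", "content": transcript.text},
    ],
)

print(draft.choices[0].message.content)  # a draft only; the clinician edits and signs
```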

The potential benefits of AI are numerous, according to Paul. He believes large language models can increase provider efficiency, improve provider wellness, increase academic time and improve patient outcomes.

References:

Kelley JM, et al. PLoS One. 2014;doi:10.1371/journal.pone.0094207.

McDuff D, et al. arXiv. 2023;doi:10.48550/arXiv.2312.00164.

Riess H, et al. J Gen Intern Med. 2012;doi:10.1007/s11606-012-2063-z.
