Question of the week

How are clinicians discussing the use of AI with their patients?

Healthcasts Team
AI is showing up in the clinic more often, and that raises a lot of questions for both patients and providers. How do clinicians explain what AI is doing, what it can and cannot do, and how it fits into the care plan?

 

In a recent post on Healthcasts, clinicians shared how telehealth fits into their day-to-day practice, if at all. From chronic care follow-ups and lab reviews to improving access for patients who might otherwise miss an appointment, their responses highlight where virtual visits work well and where they don't.

How does telehealth fit into your practice, if at all? Log in or sign up to read the full post, share your approach, and see the Consensus.

 

Question of the week

1. When AI-based decision support tools are used in patient care, what communication strategies can clinicians use to explain the role and limitations of AI in a clear, patient-centered manner that supports trust, shared decision-making, and ethical use?
 

 

Comments

Key takeaways about discussing the use of AI with patients:   

  • AI supports care, but clinicians make the decisions
    Clinicians consistently frame AI as a tool that assists analysis, screening, or summarization—not one that replaces clinical judgment. Diagnosis and treatment decisions remain firmly human-led.
  • Simple language helps patients understand AI’s limits
    Clear analogies and plain explanations help patients see AI as helpful but imperfect, especially as more patients arrive with AI-generated assumptions about their health.
  • Trust comes from transparency and human oversight
    When AI influences care, clinicians emphasize openness, safeguards, and verification, reinforcing that physicians remain accountable for every decision.

Dermatology

"AI helps me analyze your skin condition more accurately, kind of like a second opinion. However, it’s just a tool that I use, and I still make all the final decisions about treatment. AI can’t catch everything, so I always rely on my experience and judgment too. Also, all personal information is kept safe and private."

Internal Medicine

"I explain to my patients that AI is just a supportive summary tool - it cannot make diagnoses but may suggest them. AI is not a physician and does not take the place of physicians when it comes to making diagnoses or making treatment plans. I am still doing the 'hard work' with AI only supporting."

Family Practice

"I would inform the patient that AI can be used as a supportive tool, but diagnosing requires physically seeing a patient as well and running different labs, where a physician in person becomes necessary. I would also explain that not all diagnoses present the same in every patient, which is where AI cannot help if there is no information on 'Dr. Google.'"

Dermatology

"I think AI used for scribing will improve the quality and accuracy of the note. If I were using AI for decision-making, I would be embarrassed to tell the patient. Using it as an assistant in generating a differential diagnosis is totally fine to use, and by no means would you be obligated to tell the patient. Doing so would make them lose faith in you."

Hospitalist

"AI may be used for helping document the visit, and help with more administrative tasks. AI tools have also commonly been integrated into point-of-care tools that clinicians traditionally used for medical knowledge and decision support. The human is always verifying the AI output, and the AI is not independently/autonomously signing tasks."

Ophthalmology/Optometry

"In eye care, most AI tools to date focus on disease screening. I explain to patients that it's another tool to help doctors screen for and detect eye disease, which doctors can then take a further look at to evaluate its findings."

Internal Medicine

"A lot of patients are using ChatGPT to self-diagnose, which isn't ideal, although better than just googling. However, this can cause more anxiety than help. Most of the time, they are convinced that they have a certain condition. For clinicians, AI tools are available, but also not perfect. We still need to utilize our own judgment and training to formulate the correct plan of treatment."

Oncology/Hematology

"We do need to explain to patients the utility of AI in clinical decision-making. When AI-based decision tools are available, we need to be upfront and honest with the patient that these tools are being used in their clinical diagnostic and management decisions.

I would highlight that there are limitations to this as well. I would also highlight that as a clinician, we are the gatekeepers and need to make sure that all the data being processed and all the clinical decisions being made regarding the patient are consistent with current clinical guidelines, current research data, and standards of care."

Pulmonology

"I explain in clear language that AI only assists me, that I make joint clinical decisions with the patient with my knowledge and experience. I use the analogy that I cannot always trust Google Maps to get me to my destination safely. Many safeguards and clinical knowledge are critical to reach a wise decision that is clinically appropriate."

Internal Medicine

"I personally am not using AI in any note writing or decision making. If I were to use AI for note writing, I would disclose to patients that it is a tool that is used to simplify and streamline our communications while also possibly reducing missed pieces of important information. I have heard most patients are ok with that, but it should be disclosed."

How do you approach conversations about AI with your patients? Share your perspective and read all of the comments on the post on Healthcasts. 

 
