Question of the week

How do practitioners really feel about involving AI in clinical decisions?

Alex Sixt

As the use of AI in clinical decision-making becomes increasingly prevalent, healthcare practitioners across specialties are weighing its benefits and challenges. From streamlining documentation to aiding in complex diagnoses, AI offers promising support, but not without concerns around privacy, accuracy, and appropriate use. 

Below, practitioners on Healthcasts discuss whether AI should be integrated into clinical decisions and what guardrails should be put in place. 

Read the consults to discover our community's perspectives on AI in clinical decision-making, then log in or sign up to see the consensus and share your own. 

 

Question of the week

1. Do you support the integration of AI tools in routine clinical decision-making?
2. What limitations or guardrails should be considered for safe implementation?

 

Consults

Key takeaways about integrating AI into clinical decision-making:  

  • AI is a helpful tool, not a replacement
    Most practitioners see AI as a valuable support for diagnosis and documentation, but emphasize that clinical judgment must remain central. AI should assist, not replace, the provider’s expertise.

  • Guardrails are crucial
    Concerns around privacy, bias, and accuracy highlight the need for strong safeguards. Responsible use and verification are essential to ensure safe and ethical AI integration.
  • One size doesn’t fit all
    AI’s role varies by specialty and experience level. While some welcome its help with documentation, others caution against overuse by residents or those unfamiliar with clinical nuance.



Hospitalist

"1. Yes, we should be adept in using AI as a resource and tool. It is inevitable, so the sooner we are able to learn it and gain agility, the better. It is the best way to be ready, as it seems it may be more and more present.

2. HIPAA compliance, making sure that all use it responsibly and that clinical judgement and in-person interaction with the patient will always be best."

Oncology/Hematology

"One highly promising use of AI is to identify rare diagnoses or unusual manifestations in particularly complex cases. In the past, doctors relied on their memory and experience. After collecting patients' symptoms, signs, laboratory tests, and radiological findings, they would compare them with the knowledge and cases in their memory to make their own diagnosis.

Therefore, the ability to optimize diagnosis through memory and experience has become a key indicator to distinguish the clinical level of doctors. In modern times, doctors often use Internet search engines, online resources, or differential diagnosis generators to help explain clinical manifestations that are difficult to explain."

Internal Medicine

"At this point in time, although AI gives a good summary of data available, it does not distinguish between authentic, clinically proven, and other data out there. Therefore, careful review and personal experience should still be used before deciding to implement the results in decision-making."

Psychiatry

"I think the likelihood of AI being part of the documentation of clinical presentation of psychiatric disorders is inevitable. It is important for clinicians to reread their clinical records to ensure the data and conclusions are accurately represented in their clinical histories. Though suggested diagnoses may be helpful as one is formulating clinical impressions, the provider must be careful to give weight to the primary diagnosis. The treatment recommendations must then predictably flow from this specific diagnostic impression."

Internal Medicine

"We have AI capability in EMR, but I am a bit concerned about patient privacy from third-party vendors, such as the AI provider. I also find that the embedded AI causes my computer to slow down tremendously and kill the battery life. I will wait to implement AI."

Internal Medicine

"It will have to be integrated someday. Sooner the better, probably. Intern residents should definitely not be using it actively. I feel like it hinders the learning process. I think it should be reserved for seniors at this time. The other issue is APPs using it and not understanding the answer that is populated to them. There are always nuances that need a higher level of understanding, and they may not realize this."

Psychiatry

"I think if these are used responsibly and are fully secured and HIPAA compliant, I think AI could potentially offer quite a bit to alleviate the paperwork and documentation burden. Our practice, which uses EPIC, is gradually rolling out AI functionalities to some early adopters to test them out. The length of our psychiatry documentation has posed some difficulties, but I am looking forward to having AI summarize the recent notes, assist with documenting lengthier hospital courses, and synthesize the conversation into meaningful daily progress notes. I'm less interested in using AI diagnostically, but it is an added bonus."

Oncology/Hematology

"Yes, I support the use of AI for assistance in clinical decision making. It is a great tool to help quicken the clinical decision-making process, but it needs to be adequately regulated to ensure no bias in the provided information. The final decision must remain with the clinician, but using AI as a resource/tool can help lessen the burden of the clinician and speed the process of research and aggregation of information."

Internal Medicine

"I do feel that, as in all uses of AI, we are getting lazy and dependent on these tools. It’s becoming too easy to just ask AI rather than doing research on your own and networking with other providers to work up a challenging patient case. However, I do feel that it definitely can help take some of the documentation burden off of clinicians."

Pulmonology

"1. Yes, it can bring new insight into the differential diagnosis of complicated cases
2. Always verify... I've seen "hallucinations" using AI in other areas, so always double-check!"

Continue reviewing our community's perspectives on integrating AI into clinical decision-making on Healthcasts, then share your own.
