
How healthcare organizations should train staff on AI use

The U.S. Department of Health and Human Services' (HHS) Artificial Intelligence (AI) Strategy notes the importance of ensuring "that all employees whose work could benefit from access to frontier models have access to, and appropriate training for, such tools." This commitment from the agency shows that AI literacy is no longer optional; it is now a necessity for modern healthcare delivery.

The pace of AI adoption in healthcare has accelerated in recent years. According to a TechTarget article on AI and HIPAA compliance, a 2025 survey from the American Medical Association found that 66 percent of physicians now use AI in their practices, up from just 38 percent in 2023. The increase makes staff training a necessity, as AI "is being incorporated into almost all aspects of a healthcare lifecycle," from clinical care and diagnostics to administrative tasks and research.

However, a study titled Healthcare workers' knowledge and attitudes regarding artificial intelligence adoption in healthcare: A cross-sectional study found that only 39 percent of healthcare workers demonstrated good overall knowledge about AI in healthcare, while 73 percent held positive attitudes toward its adoption.

Additional research, Healthcare workers' readiness for artificial intelligence and organizational change: a quantitative study in a university hospital by Hafize Boyacı and Selma Söyük, confirmed this positive attitude: healthcare workers are prepared for the use of medical artificial intelligence in healthcare institutions and perceive organizational change positively. However, their study also found only a low-level positive relationship between healthcare workers' readiness for medical artificial intelligence and their perceived openness to organizational change, suggesting that enthusiasm alone doesn't guarantee smooth implementation without proper preparation.

 

Start with the "Why" before the "How"

AI training programs should begin with helping staff understand the purpose behind these tools. The cross-sectional study found that 46 percent of healthcare workers expressed concerns about potential job displacement due to AI. Boyacı and Söyük's research echoes this concern, noting that healthcare workers need reassurance about AI's role in augmenting rather than replacing their work.

The impact of AI on healthcare efficiency is measurable. According to a McKinsey Health Institute report, up to 30 percent of nurses' tasks could be automated or delegated, freeing up time for more meaningful work. The report estimates that freeing up healthcare workers' time through AI and automation could create the equivalent of two million additional workers globally, a much-needed contribution given the projected shortage of at least ten million healthcare workers by 2030.

In one study cited by the McKinsey Health Institute, AI-assisted clinical documentation reduced consultation length by 26 percent while maintaining patient interaction time and improving documentation accuracy. Another initiative helped primary-care physicians cut the time spent reviewing notes and determining proper billing codes by up to 72 percent.

The HHS AI Strategy recognizes this principle, aiming to "empower our workforce and enhance our staff's capacity to stay focused on measurably improving health and human services delivery."

 

Tailor training to different roles and skill levels

The cross-sectional study revealed that healthcare workers aged 18 to 35 demonstrated better knowledge and more positive attitudes toward AI compared to older colleagues. Boyacı and Söyük's research adds further demographic considerations: readiness for medical artificial intelligence was high among males, doctors, and those in the internal sciences, while perceived openness to organizational change was high among postgraduate/doctoral graduates, those in the surgical sciences, and nurses. These findings show the need for training programs that acknowledge and address different starting points and perspectives across the healthcare workforce.

The HHS AI Strategy takes this tailored approach seriously, committing to "Establish role-based training pathways (introductory to advanced) aligned to mission roles (e.g., clinician, regulator, analyst, grants manager)." Furthermore, the strategy acknowledges the spectrum of expertise needed by "Formalizing and disseminating AI-related training for those who are first interacting with generative AI to advanced practitioners building and refining bespoke models."

 

Teach AI's limitations and critical thinking

AI tools can produce errors and reflect biases present in their training data. Healthcare workers must understand that AI outputs are recommendations and that clinical judgment remains necessary. "AI-enabled health technologies" topped the list of the most significant technology hazards in the healthcare industry in the nonprofit Emergency Care Research Institute's (ECRI) "Top 10 Health Technology Hazards for 2025" report. This underscores the need to train staff to understand AI's limitations.

Furthermore, the cross-sectional study found that fewer than half of healthcare workers understood ethical considerations associated with AI adoption, and only 44 percent grasped how to ensure AI benefits diverse patient populations. Training programs should include real-world examples of AI failures and near-misses to demonstrate why human oversight matters.

 

Incorporate hands-on practice and simulation

Training programs should include hands-on sessions where staff interact with AI tools in controlled environments before using them in patient care settings. The cross-sectional study found that healthcare workers who attended AI conferences or learned through research articles demonstrated higher knowledge levels and more positive attitudes compared to those who learned through social media or television news.

 

Demonstrate AI's clinical impact with evidence-based examples

The McKinsey Health Institute report documents several examples of AI effectiveness in real-world settings. In Malawi, computer-aided X-ray interpretation reduced the time to diagnose tuberculosis and HIV by 90 percent, from 11 days to one day. The automation generated accurate diagnoses in 91 percent of cases while reducing patient visits and freeing up valuable time for physicians to focus on more direct patient care.

These examples should be incorporated into training programs to help staff visualize AI's potential benefits. The cross-sectional study found that 62 percent of healthcare workers recognized AI's potential to improve patient outcomes, and 51 percent understood its role in early disease detection.

 

Address ethical considerations and patient communication

As the TechTarget article notes, "There are concerns about where data lives, who has access to it and how that data is being used." These concerns extend beyond technical security measures to questions about patient privacy and data governance. Training programs must equip staff to understand this, especially as healthcare organizations rely on cloud-based AI tools and share data with third-party vendors.

The TechTarget article further states, "AI systems are not a special scenario falling outside of these existing robust compliance obligations." The rules for notice, consent, and responsible use of data that apply to traditional healthcare operations apply equally to AI systems. Healthcare organizations should be "laser-focused on applying robust governance controls, whether data will be used to train AI models, ingested into existing AI systems or used in the delivery of healthcare services."

The cross-sectional study found that only 42 percent of healthcare workers understood the ethical considerations associated with AI adoption, and fewer than half grasped the security and privacy measures necessary for protecting patient data in AI applications. These gaps must be addressed through ethics training.

 

Measure competency and gather feedback

Organizations should assess whether training achieves its intended outcomes through competency evaluations and feedback mechanisms. This might include testing staff understanding of key concepts, observing how they interact with AI tools in practice, and tracking metrics like appropriate use rates and error identification.
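The metrics named above can be computed from simple audit records. The sketch below is a hypothetical illustration: the `AIUseRecord` format, field names, and sample data are assumptions for demonstration, not drawn from the article's sources.

```python
# Hypothetical competency-tracking sketch: compute "appropriate use" and
# "error identification" rates from audit records. The record format and
# sample data are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AIUseRecord:
    staff_id: str
    used_appropriately: bool   # reviewer judged the AI use appropriate
    ai_output_had_error: bool  # the AI output contained an error
    staff_caught_error: bool   # staff flagged the error before acting

def competency_metrics(records: list[AIUseRecord]) -> dict[str, float]:
    """Aggregate two simple training-outcome metrics from audit records."""
    total = len(records)
    appropriate = sum(r.used_appropriately for r in records)
    errors = [r for r in records if r.ai_output_had_error]
    caught = sum(r.staff_caught_error for r in errors)
    return {
        "appropriate_use_rate": appropriate / total if total else 0.0,
        # If no errors occurred, there was nothing to miss.
        "error_identification_rate": caught / len(errors) if errors else 1.0,
    }

audit = [
    AIUseRecord("rn-01", True, False, False),
    AIUseRecord("rn-01", True, True, True),
    AIUseRecord("md-07", False, True, False),
    AIUseRecord("md-07", True, False, False),
]
metrics = competency_metrics(audit)
print(metrics)  # appropriate use 3/4, errors caught 1/2
```

Tracking these rates over time, rather than at a single point, shows whether training gains persist after the initial sessions.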

Based on their research findings, Boyacı and Söyük recommend that employees be made aware of the benefits of using artificial intelligence in healthcare institutions and that the necessary training activities be planned accordingly.

 

Case study: The University of Florida's AI for Clinical Care Workshop

An example of AI training comes from the University of Florida's AI for Clinical Care (AICC) workshop, held in April 2024 as part of the NIH Bridge to Artificial Intelligence consortium. The workshop attracted 90 participants from 14 academic institutions across the United States, offering both beginner and advanced tracks to accommodate varying skill levels. The beginner track, designed for individuals without prior programming experience, featured hands-on tutorials using Jupyter notebooks with real-world clinical datasets. Participants worked through six interactive sessions covering Python programming, biomedical data analysis, machine learning for clinical care, and deep learning.
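A beginner-track exercise of this kind might resemble the following sketch: loading a small, public diagnostic dataset and fitting a simple classifier. This is a hypothetical illustration of the workflow, not material from the AICC workshop; the dataset choice (scikit-learn's built-in breast cancer data) and model are assumptions.

```python
# Hypothetical beginner-track exercise: train a simple model on a public
# clinical dataset. Dataset and workflow are illustrative, not taken from
# the AICC workshop materials.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load a small, well-known diagnostic dataset (569 biopsies, 30 features).
X, y = load_breast_cancer(return_X_y=True)

# Hold out 25% of cases for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Fit a logistic regression classifier; max_iter raised for convergence.
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

# The teaching point: a model's output is a recommendation, so staff
# should always check performance before trusting it in practice.
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Held-out accuracy: {accuracy:.2f}")
```

Exercises at this level keep the mathematics in the background and foreground the clinical question: what does the model predict, and how often is it right?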

The workshop avoided overwhelming participants with complex mathematical foundations. Instead, it treated AI as a clinical tool or medical device, focusing on when, where, how, and why to use AI for specific healthcare applications. This practical approach resonated with clinicians who needed applicable skills rather than technical training.

Prior to the event, beginner track participants completed two self-paced online courses (Foundations of AI-Based Medicine and AI-Based Medicine Technical Expertise), with registration fees waived for workshop attendees. This pre-workshop foundation allowed the in-person sessions to dive deeper into practical applications.

Post-workshop surveys from 41 participants revealed statistically significant knowledge gains across all eleven learning objectives in both tracks. For the beginner track, all six learning objectives showed improvements, with the largest increase in participants' ability to identify important Python libraries for biomedical data science. The advanced track similarly demonstrated gains across all five objectives, with the highest improvement in building U-Net models to generate images from noise.

Self-assessed knowledge levels rose from "some ability" to "moderate to good ability" for beginner track participants, while advanced track participants progressed from "moderate ability" to "good ability." Approximately one-quarter of respondents reported that networking events resulted in follow-up plans after the conference, including continued mentorship and research collaborations.

The workshop's success also demonstrates the value of institutional partnerships. Collaboration between the University of Florida College of Medicine, NVIDIA, and the NIH Bridge2AI consortium brought together clinical expertise, technical knowledge, and funding resources. For healthcare organizations considering similar initiatives, the AICC workshop offers a replicable model: combine pre-workshop foundational courses with intensive hands-on practice, offer tiered tracks for different skill levels, leverage partnerships for expertise and resources, incorporate real-world clinical applications, and plan for sustained engagement beyond the initial training event.

 

FAQs

How should healthcare organizations budget for AI training programs?

Organizations should treat AI training as a long-term operational investment, similar to cybersecurity or compliance training.

 

How can smaller healthcare organizations with limited resources approach AI training?

Smaller organizations can rely on vendor-led education, shared learning networks, and low-cost online courses to build foundational AI skills.

 

What role should AI vendors play in staff training?

Vendors should provide transparent training on system limitations, updates, and real-world use cases, but organizations must retain independent oversight.
