Implementing health AI? Why you need a diverse working group

Before committing to a technology, health care organizations need the right voices to shape the best tool possible.

By Tanya Albert Henry, Contributing News Writer | 5 Min Read

AMA News Wire

Jul 31, 2025

The more the merrier, the saying goes. In the case of health care organizations that are implementing augmented intelligence (AI)—commonly called artificial intelligence—it is indeed crucial to form robust working groups that include people who represent a broad group of departments.

Physicians, nurses and other clinical staff, as well as representatives from the operational, financial, legal, compliance, technology, data science, pharmacy and patient experience branches of the organization need to be part of the decisions being made when implementing AI. The AMA defines AI as augmented intelligence to emphasize that the technology’s role is to help health care professionals—not replace them.

“Taking the time to slow down and set up an appropriate process to be able to vet these tools and to make them meaningful from a clinical perspective is critical so that you can set up your practice or organization for success in the future,” said Margaret Lozovatsky, MD, who is chief medical information officer and vice president of digital health innovations at the AMA. 

With two-thirds of physicians surveyed reporting that they use health care AI in their practice, the AMA STEPS Forward® “Governance for Augmented Intelligence” toolkit—developed in collaboration with Manatt Health—is a comprehensive eight-step guide for health care systems to establish a governance framework to implement, manage and scale AI solutions.

Forming a working group to detail priorities, processes and policies is the second step of the process. The seven other foundational pillars of responsible AI adoption are:

  • Establishing executive accountability and structure.
  • Assessing current policies.
  • Developing AI policies.
  • Defining project intake, vendor evaluation and assessment processes.
  • Updating standard planning and implementation processes.
  • Establishing an oversight and monitoring process.
  • Supporting AI organizational readiness.

A recent AMA webinar (available now on demand) explores in-depth how to establish an AI governance framework. 

From AI implementation to EHR adoption and usability, the AMA is fighting to make technology work for physicians, ensuring that it is an asset to doctors—not a burden.

Why a working group matters

Decisions about AI technology need to be made as close to patient care as possible, Dr. Lozovatsky said in an interview.

“And what I mean by that is that every clinician and every administrative user of these tools has different needs, so it’s important to involve those stakeholders in the conversation to understand what the needs are and to ensure that technology is actually solving problems rather than creating them,” she said. “I’m a pediatrician. My needs are different than a cardiologist or a dermatologist. That’s why it’s important to have the right voices in the conversation.”

But it’s not just about meeting physicians’ needs, Dr. Lozovatsky said. AI tools can affect work for others in the organization, too. For example, when a physician puts in an order, a nurse may carry it out, so as AI tools are developed there needs to be an understanding of how the technology affects that nurse’s work.

Some things that the working group may be responsible for—and why it is important to have so many different voices in the room—include:

  • Defining and assessing the organization’s current use of AI tools (for example, medical imaging analysis).
  • Setting AI priorities and ensuring they align with the organization’s strategy and available resources.
  • Detailing AI policies and updating intake, implementation, evaluation, compliance and oversight processes.
  • Engaging stakeholders and managing enterprise communication.

Some more specific questions that working group members need to consider include:

  • What are the highest priority use cases for our organization?
  • What are the primary risks we aim to mitigate?
  • What does success look like in our AI efforts five years from now?
  • How will AI generate value for the organization?
  • Are we going to pursue internal development, co-development and/or off-the-shelf AI tools?
  • What is our approach to integration and workflow design?
  • Can the organization use certain AI tools without a standard governance review? If so, what are the thresholds?
  • How will the organization address the use of free, publicly available AI tools?

Partnership is key

Many physicians have stories of how newly implemented technology negatively impacted their workflow or wasn’t designed to do everything it could be doing. Creating a robust working group aims to prevent those experiences.

“You need a technology expert to help paint the picture of what’s possible and you need the clinicians to explain what problems we’re trying to solve. That partnership is where the magic happens,” Dr. Lozovatsky said.

In addition to fighting on the legislative front to help ensure that technology is an asset to physicians and not a burden, the AMA has developed advocacy principles (PDF) that address the development, deployment and use of health care AI, with particular emphasis on:

  • Health care AI oversight.
  • When and what to disclose to advance AI transparency.
  • Generative AI policies and governance.
  • Physician liability for use of AI-enabled technologies.
  • AI data privacy and cybersecurity.
  • Payer use of AI and automated decision-making systems.

Find out how participants in the AMA Health System Member Program are using AI to make meaningful change. And learn more with the AMA about the emerging landscape of health care AI.

Also, explore how to apply AI to transform health care with the “AMA ChangeMedEd® Artificial Intelligence in Health Care Series.”
