4 essentials to develop a successful health care AI solution

Andis Robeznieks, Senior News Writer
Michael Abramoff, MD, PhD

While augmented intelligence (AI), often called artificial intelligence, holds great promise to streamline health care if done right, there are no shortcuts to its implementation, according to a physician who has been working in the field of health care AI for 30 years.

The IDx-DR diabetic retinopathy diagnostic exam solution created by physician-scientist and AMA member Michael Abramoff, MD, PhD, is the first autonomous AI system authorized by the Food and Drug Administration. It is autonomous in that a computer makes the diagnosis, not a human.


It was an eight-year journey to FDA approval for Dr. Abramoff, an ophthalmologist specializing in retinal disease who was disturbed by how long it often took for patients with diabetes to see an eye-care specialist for a diabetic retinopathy exam. He was also bothered by how specialists’ schedules are frequently crammed full of routine eye-exam visits that did not require their level of expertise.

“For other technologies, it may not take as long, but this is really, in a way, ground-shaking because it’s not a human but rather a computer making a medical decision,” Dr. Abramoff said. “Looking back, I don't think it could have been done differently, because we needed to start the right way.”



Dr. Abramoff is a professor of ophthalmology at the University of Iowa Carver College of Medicine and practices ophthalmology with University of Iowa Hospitals & Clinics (UIHC), an AMA Health System Program member. He spoke during a virtual meeting of the AMA Insight Network that covered how to get a health care AI program up and running, and how to use it properly.

The AMA Insight Network helps AMA Health System Program members gain early access to innovative ideas, get feedback from their peers, network, and learn about pilot opportunities.

For those seeking to follow the same path, Dr. Abramoff recommended these essentials.

“We needed to start with ethical principles and a lot resulted—and it's still developing—from that,” he said, explaining that this means ensuring patient autonomy and equity, and mitigating, rather than exacerbating, any racial or ethnic bias that may find its way into the program.

“The more you have this ethical framework in place, the more you're able—or we were able, at least—to convince all stakeholders, patient organizations, regulators, payers like CMS [Centers for Medicare & Medicaid Services], private payers, physicians, nurses, technicians to all say, ‘yeah, this is something we trust,’” Dr. Abramoff said.


“Ask: Are you actually doing something that’s either improving patient outcomes or improving a population’s outcome?” Dr. Abramoff said. “If the AI cannot aid with that, why are we doing it?”

Otherwise, the technology becomes what Dr. Abramoff calls “glamour AI.” That’s when the technology seems exciting to implement, but has no real patient benefit.

“We should really focus on patients and populations and improving their health, removing health inequities, improving access, and improving quality of care,” he said.

Dr. Abramoff doesn’t use his solution in his own clinic. It’s built for use where people with diabetes already get their care, outside of eye clinics. “It needs to fit into the primary care, endocrinology, and internal medicine workflow,” he said, adding that ordering, billing, and claims are all done automatically.

“It's all about workflow,” Dr. Abramoff said. “If it slows down the clinic, if it leads to extra clicks for the physician, then they will be reluctant to use it.

“Anything you can do to make it easier for those people who manage the patient’s diabetes so that it becomes easier to get the eye exam done, the better for the patient,” he added.


CMS and private payers will pay practices for using autonomous AI. “You can bill for it and get paid,” Dr. Abramoff said. “It really helps with adoption.”

When it comes to medical liability, “you’re liable for your medical decisions, and rightfully so,” Dr. Abramoff said. But, he explained, primary care physicians and endocrinologists using his system are essentially outsourcing the diabetic eye exam medical decision to the AI and therefore should not be held liable for its performance; the AI creator should be. That is the position his company took early on, and he noted that it is now AMA policy.