Doctors are already using unregulated artificial intelligence tools such as note-taking virtual assistants and predictive software that helps them diagnose and treat diseases.
The government has slow-walked regulation of the fast-moving technology because agencies like the Food and Drug Administration face vast funding and staffing challenges in writing and enforcing rules, and they are unlikely to catch up any time soon. That means the AI rollout in health care is becoming a high-stakes experiment in whether the private sector can help transform medicine safely without government watching.
“The cart is so far ahead of the horse, it’s like, how do we rein it back in without careening over the ravine?” said John Ayers, an associate professor at the University of California San Diego.
Unlike medical devices or drugs, AI software changes. Rather than issuing a one-time approval, the FDA wants to monitor artificial intelligence products over time, something it has never done proactively.
President Joe Biden in October promised a coordinated and fast response from his agencies to ensure AI safety and efficacy. But regulators like the FDA don’t have the resources they need to oversee a technology that, by definition, is constantly changing.
“We’d need another doubling of size and last I looked the taxpayer is not very interested in doing that,” FDA Commissioner Robert Califf said at a conference in January and then reiterated the point at a recent meeting of FDA stakeholders.
Califf was frank about the FDA’s challenges. Because AI is constantly learning and may perform differently depending on the setting, evaluating it is a monumental task that doesn’t fit his agency’s existing paradigm. When the FDA approves drugs and medical devices, it doesn’t need to keep tabs on them afterward, because they don’t change.
And the problem for the FDA goes beyond adjusting its regulatory approach or hiring more staff. According to a new report from the Government Accountability Office, the watchdog arm of Congress, the agency wants more power: the authority to request AI performance data and to set guardrails for algorithms in more specific ways than its traditional risk assessment framework for drugs and medical devices allows.