The Food and Drug Administration on Tuesday published a list of artificial intelligence tools that should be regulated as medical devices, in some cases appearing to expand its oversight of previously unregulated software products.
In a new final guidance for industry, the agency specified that tools designed to warn caregivers of sepsis, a life-threatening complication of infection, should come under regulatory review. Health software vendors have been selling tools designed to flag the condition for years without obtaining clearance from the FDA.
Sepsis, which kills more than 200,000 people in the U.S. annually, is particularly difficult to detect. Several companies have developed AI tools to predict which patients are most likely to develop the condition, in a bid to help hospitals speed the delivery of antibiotics and save more lives.
But the tools do not always work as advertised. STAT published multiple investigations detailing the shortcomings of a widely used tool developed by Epic Systems, the nation’s largest vendor of electronic health records. The investigations found that the tool frequently delivered false alarms and failed to catch the condition in advance, distracting caregivers working in time-sensitive situations. Epic’s sepsis alert system is used by more than 180 customers in the U.S. and Canada.
The FDA has traditionally steered clear of regulating software tools embedded in electronic health records, a domain seen as outside its purview because the software mainly served as a record-keeping system that posed minimal risk to patients.
But the increasing sophistication of the products used within EHRs, and the growing role they play in advising providers on the treatment of serious and life-threatening conditions, have generated increasing calls for the FDA to take a closer look at those products.
“EHR vendors have to have oversight, in terms of how they build these algorithms and how they check them for bias,” said Leo Celi, a biostatistician at Harvard University who recently published a paper calling for stepped-up regulation.
Celi added that even the new guidance, which listed 34 separate types of products the FDA thinks should be regulated, does not create clarity because of the fine-grained distinctions between product categories and room for interpretation of the FDA’s language.
“There needs to be more public discourse and discussion between all the stakeholders,” he said. “They come up with this (guidance) and it’s not very clear where the line is between software as a medical device and a non-device.”
The new guidance is non-binding and does not necessarily mean that the FDA will soon begin to regulate sepsis tools and other products flagged as devices in the document. It is meant to clarify regulatory boundaries described in the federal 21st Century Cures Act of 2016, which included carve-outs for technology products that lawmakers wanted to exclude from FDA review.
But the carve-outs turn on definitions that are difficult to parse and may apply unevenly to a new generation of AI products flooding into the market. In addition to sepsis-related products, the guidance also indicates the FDA believes it should be reviewing products that predict heart failure hospitalizations, as well as those designed to identify signs of patient deterioration or to review medical information to identify patients who might be addicted to opioids.