The passage of the European Artificial Intelligence Act (AIA) marks a major step toward new compliance requirements for many medical devices. It also adds to the daunting regulatory complexity the industry faces, coming on top of the effort and investment that went into the new Medical Device Regulation (MDR), which became applicable in May 2021.
The European Parliament passed draft AI Act legislation on June 14, despite the protestations of industry bodies such as MedTech Europe, the Confederation of European Business and Team-NB (The European Association for Medical Devices of Notified Bodies). In a recent Clarivate webinar, panelist Susana de Azevedo Wäsch of Ypsomed AG alluded to the difficulty regulations face in catching up and keeping up with AI-informed devices, calling the present situation a “regulatory lasagne”, a concept already well known to the European regulatory affairs community.
While industry bodies have their doubts about the feasibility of the AI Act in action, the Act will now enter trilogue discussions between the European Commission, the European Parliament, and the Council of the E.U. Upon the adoption and entry into force of the AI Act, a general transition period of three years is expected, allowing businesses and stakeholders to adjust their operations and align their products with the requirements of the Act.
Risk and responsibility
Akin to the MDR, the new rules follow a risk-based approach and establish obligations for providers and users depending on the level of risk an algorithm poses. AI systems posing risks deemed unacceptable are strictly prohibited, including systems that deploy subliminal or purposefully manipulative techniques, exploit people’s vulnerabilities, or are used for social scoring.
Under the AI Act, medical devices or in vitro diagnostic medical devices that are AI-powered or incorporate AI as a safety component fall under both the MDR/IVDR and the AI Act. Legal professionals generally do not see a problem with this dual coverage, but some have expressed concern that E.U. legislators have failed to harmonize these regulations to avoid contradictions. Ultimately, the AIA will determine whether and how new AI-enabled medical technologies can be placed on the market.
One of the challenges for medtech manufacturers is that so much of the AI Act does not directly address medical devices. One noteworthy provision in the draft text is found in Recital 30, which stipulates that any medical device incorporating AI components and requiring the involvement of a notified body during a conformity assessment under the MDR or In Vitro Diagnostics Regulation (IVDR) would automatically be classified as a high-risk device under the AI Act.
Under the existing MDR framework, medical devices fall into different risk categories (i.e., I, IIa, IIb, and III, with higher classes indicating higher risk). The AI Act, however, introduces a shift in risk classification for devices incorporating AI components: devices categorized as IIa and IIb under the MDR, which were previously considered medium risk, will now be classified as high risk under the AI Act and will need to be assessed separately for conformity with the new AI legislation rules. Low-risk devices or diagnostics that can be placed on the E.U. market without a notified body conformity assessment under the MDR or IVDR will be exempt.
There’s concern as well about other E.U. legislation that interacts with the AI Act, said Mark McCarty, Regulatory Editor of BioWorld MedTech.
“One of these other legislative considerations is a proposal by the European Commission to expand product liability,” said McCarty. “While the provisions for AI products are to some extent directed toward hiring decisions, this AI Liability Directive also carries a presumption of causality that developers of medical AI software may find problematic when it comes to product liability litigation.”
On the other hand, one feature of the AI Act could offer an opening for a regulatory mechanism similar to the predetermined change control plan (PCCP) that is the subject of an FDA draft guidance, said McCarty. Such a measure, which accounts for the iterative nature of software development, would be a salutary addition from an industry perspective.
“Page 33 of the current version of the AI Act states that any changes to a machine learning algorithm that are spelled out in the application need not require a new conformity assessment,” he noted.
Companies face a learning curve in understanding and adapting to this new regulatory regime. Clarivate’s BioWorld MedTech news and analysis and regulatory consulting solutions can help companies keep abreast of this constantly evolving regulatory landscape.
How should medtechs prepare for the AIA?
It is currently unclear just how challenging it will be for manufacturers to demonstrate compliance with the AI Act in practice. Reclassification under the AI Act means devices will require separate conformity assessments against the new AI legislation rules, and medtech developers will need to obtain relevant certificates showing that their AI technologies meet the required standards of safety, accuracy, reliability, and ethics.
Medtechs will want to carefully evaluate the AI components within their devices and assess them for potential risks to patient safety, data security, and the integrity of medical processes. Companies may also have to implement measures to ensure the transparency of their AI systems, an aspect emphasized by the AI Act. The Clarivate Cortellis Regulatory Intelligence Solution offers a comprehensive repository of regulatory intelligence that aids strategic decision making by providing timely coverage of global health authority requirements on healthcare products.
Alignment between the AI Act and sectoral legislation is critical to facilitating continued patient access to innovative healthcare products. While final implementation of the AIA may be years away, companies should begin preparing now to ensure continued access to European markets for their products.