Below are the requirements for high-risk AI systems and their links to MDR/IVDR requirements from the perspective of a provider’s obligations. Note: any reference to a “manufacturer” in accordance with the MDR/IVDR is to be understood as a reference to a “provider” in accordance with the AI Act. All requirements for high-risk AI systems, as well as the obligations of providers and deployers of high-risk AI systems and other parties, are set out in Chapter 3, Sections 2 and 3 of the AI Act.
Management systems:
The AI Act introduces requirements for the management systems of high-risk MDAI on top of those of the MDR and IVDR. These include a system-specific quality management system (AIA, Article 17), covering data management, transparency and human oversight, and continuous, iterative risk management (AIA, Article 9), which also addresses fundamental rights risks and data biases. In addition, the AI Act places particular emphasis on post-market monitoring of systems that continue to learn after being put into service.
Technical documentation:
The MDR, IVDR and the AI Act all require comprehensive technical documentation for MDAI systems. The MD and IVD regulations require detailed descriptions of the software, its architecture, data processing methods and risk management strategies. The AI Act requires additional documentation focused on transparency and accountability, including risk assessments, information management practices and performance test results of high-risk MDAI systems. For high-risk MDAI systems, the AI Act requires a single set of technical documentation containing the information required by both the medical device regulations and the AI Act (AIA, Article 11, paragraph 2). The MDCG/AIB Guideline 2025-6 provides practical advice on how to combine the requirements of the MDR/IVDR and the AI Act into a single technical file.
Data and data management:
The MDR and IVDR require that clinical data be reliable and representative of the intended use of the device. The AI Act additionally sets requirements for the quality, governance and transparency of data (AIA, Article 10). The training, validation and testing data of high-risk MDAI systems must have appropriate statistical characteristics and be relevant, sufficiently representative and, to the extent possible, free of errors and complete. Data protection and the original purpose of the data must be documented transparently. Manufacturers must implement procedures that ensure data integrity and the identification, prevention and mitigation of potential biases.
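As an illustration only, the kind of dataset check such procedures might include can be sketched in a few lines of Python. The field names, the sample data and the 10 % representativeness threshold are invented for the example; they are not regulatory values.

```python
from collections import Counter

def data_governance_report(records, label_key, subgroup_key):
    """Summarise label balance and subgroup representation for a training
    dataset -- an illustrative check, not a compliance tool."""
    labels = Counter(r[label_key] for r in records)
    subgroups = Counter(r[subgroup_key] for r in records)
    n = len(records)
    report = {
        "n_records": n,
        "label_shares": {k: v / n for k, v in labels.items()},
        "subgroup_shares": {k: v / n for k, v in subgroups.items()},
    }
    # Flag subgroups below a chosen threshold (assumption: 10% is an
    # example figure only, not a value taken from the AI Act).
    report["underrepresented"] = [
        k for k, v in report["subgroup_shares"].items() if v < 0.10
    ]
    return report

# Hypothetical records: a label and the collection site for each case.
sample = [
    {"label": "positive", "site": "hospital_a"},
    {"label": "negative", "site": "hospital_a"},
    {"label": "negative", "site": "hospital_a"},
    {"label": "negative", "site": "hospital_b"},
]
print(data_governance_report(sample, "label", "site"))
```

A real data governance process would of course go much further (error rates, annotation quality, clinical subgroup definitions), but a report of this shape is one way to make representativeness documentable.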
Collecting logs:
The AI Act requires that high-risk MDAI systems automatically record log data throughout their lifecycle to ensure traceability (AIA, Article 12). This supports monitoring of the system’s functioning, oversight of risks affecting fundamental rights, and post-market surveillance.
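A minimal sketch of such automatic event recording, assuming a simple append-only log with timestamped entries; a production system would additionally need tamper evidence, retention policies and secure storage:

```python
import json
import time

class EventLog:
    """Append-only event recorder -- an illustrative sketch of the kind of
    automatic logging Article 12 envisages, not a prescribed mechanism."""

    def __init__(self):
        self._entries = []

    def record(self, event_type, detail):
        entry = {
            "timestamp": time.time(),   # when the event occurred
            "event": event_type,        # e.g. "inference", "human_override"
            "detail": detail,
        }
        self._entries.append(entry)
        return entry

    def export(self):
        # Serialise the full history for post-market surveillance review.
        return json.dumps(self._entries)

log = EventLog()
log.record("inference", {"input_id": "case-001", "output": "flagged"})
log.record("human_override", {"input_id": "case-001", "accepted": False})
print(len(json.loads(log.export())))  # prints 2
```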
Transparency and human oversight:
Both the AI Act and the MDR/IVDR impose transparency and human oversight obligations. Manufacturers must design and develop high-risk MDAI systems with sufficient transparency to allow users to correctly interpret the output and use the system properly (AIA, Article 13). This requires clear and comprehensible user instructions; information on the features, limitations and operating logic of the system; and documentation that supports the explainability of decisions. In addition, the AI Act calls for technical solutions that enable human intervention in critical decisions and ensure that the system cannot override these restrictions (AIA, Article 14). These obligations complement the basic requirements of the MDR/IVDR, which call for clear and comprehensive information on the purpose, function and limitations of a device, as well as documentation supporting traceability and safe use.
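One common human-in-the-loop pattern that such technical solutions might follow can be sketched as below. The confidence threshold and the idea of gating on model confidence are assumptions made for the example, not requirements taken from Article 14.

```python
def apply_with_oversight(model_output, confidence, confirm, threshold=0.9):
    """Route a model output through a human checkpoint -- an illustrative
    human-in-the-loop sketch, not a prescribed oversight mechanism.
    `confirm` is any callable returning True/False (e.g. a clinician UI);
    `threshold` is an invented example value."""
    if confidence >= threshold:
        # High-confidence outputs are applied directly (still auditable).
        return {"applied": True, "output": model_output, "reviewed": False}
    # Low-confidence outputs require explicit human approval; the system
    # cannot proceed without it.
    approved = confirm(model_output)
    return {"applied": approved, "output": model_output, "reviewed": True}

# Hypothetical usage: the reviewer rejects a low-confidence finding.
result = apply_with_oversight("suspicious lesion", 0.62, confirm=lambda o: False)
print(result)
```

The key design point illustrated here is that the override path lives outside the model: the `confirm` callable is supplied by the surrounding system, so the model itself cannot bypass the human decision.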
Accuracy, robustness and cybersecurity:
The MDR, IVDR and AI Act all emphasise the importance of cybersecurity. Manufacturers must implement technical solutions to prevent unauthorised access, attacks and manipulation. The AIA requires high-risk MDAI systems to include technical solutions addressing AI-specific vulnerabilities, such as data poisoning and adversarial inputs (AIA, Article 15). Cybersecurity is part of risk management, quality management and conformity assessment. The manufacturer of an MDAI system must secure the transfer and storage of data, prevent the manipulation of models and respond to cybersecurity incidents. These requirements apply to the entire life cycle and operating environment of a device.
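As one small, illustrative piece of such measures, tampering with a stored or transferred model file can be made detectable with a keyed integrity tag. This sketch uses Python's standard `hmac` module; key management, which a real deployment would need, is deliberately out of scope here.

```python
import hashlib
import hmac

def sign_model(model_bytes, key):
    """Compute an HMAC-SHA256 tag over serialised model weights so that
    later tampering can be detected -- an illustration only, not a full
    cybersecurity control."""
    return hmac.new(key, model_bytes, hashlib.sha256).hexdigest()

def verify_model(model_bytes, key, tag):
    # Constant-time comparison avoids timing side-channels.
    return hmac.compare_digest(sign_model(model_bytes, key), tag)

key = b"example-secret-key"  # assumption: key distribution handled elsewhere
weights = b"\x00\x01\x02 model weights"
tag = sign_model(weights, key)

print(verify_model(weights, key, tag))         # prints True (untampered)
print(verify_model(weights + b"!", key, tag))  # prints False (manipulated)
```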
Key responsibilities of the providers of high-risk AI systems and other operators
All the responsibilities of operators with regard to high-risk AI systems are stated in Chapter 3, Section 3 of the AI Act.
Main obligations of a provider
- implementation of risk and quality management systems
- record-keeping and compliance with transparency obligations
- system testing and conformity assessment before placing on the market or putting into service
- data quality and technical documentation
- CE marking and EU declaration of conformity
- registration obligation
- traceability and human oversight
- ensuring accuracy, robustness and cybersecurity
- implementing corrective measures
- cooperation with competent authorities.
Main obligations of authorised representatives
- performing the tasks specified in the mandate
- keeping the necessary documents and ensuring that they are available to the competent authorities
- registration obligation as necessary
- cooperation with competent authorities.
The mandate of an authorised representative may not include tasks such as the design and manufacturing of an AI system, the preparation of technical documentation or conformity assessment.
Main obligations of an importer
- ensuring that the system meets regulatory requirements before being placed on the market
- keeping the necessary documents and ensuring that they are available to the competent authorities
- taking corrective measures
- informing the competent authorities if non-compliance is detected
- cooperation with competent authorities.
Main obligations of a distributor
- ensuring that the system meets regulatory requirements before making it available on the market
- keeping the necessary documents and ensuring that they are available to the competent authorities
- taking corrective measures
- informing the competent authorities if non-compliance is detected
- cooperation with competent authorities.
Main obligations of a deployer
- using the system in accordance with its user instructions and purpose
- monitoring the operation of the system and reporting irregularities.