Article 46

Derogation from conformity assessment procedure

1.   By way of derogation from Article 43 and upon a duly justified request, any market surveillance authority may authorise the placing on the market or the putting into service of specific high-risk AI systems within the territory of the Member State concerned, for exceptional reasons of public security or the protection of life and health of persons, environmental protection or the protection of key industrial and infrastructural assets. That authorisation shall be for a limited period while the necessary conformity assessment procedures are being carried out, taking into account the exceptional reasons justifying the derogation. The completion of those procedures shall be undertaken without undue delay.

2.   In a duly justified situation of urgency for exceptional reasons of public security or in the case of specific, substantial and imminent threat to the life or physical safety of natural persons, law-enforcement authorities or civil protection authorities may put a specific high-risk AI system into service without the authorisation referred to in paragraph 1, provided that such authorisation is requested during or after the use without undue delay. If the authorisation referred to in paragraph 1 is refused, the use of the high-risk AI system shall be stopped with immediate effect and all the results and outputs of such use shall be immediately discarded.

3.   The authorisation referred to in paragraph 1 shall be issued only if the market surveillance authority concludes that the high-risk AI system complies with the requirements of Section 2. The market surveillance authority shall inform the Commission and the other Member States of any authorisation issued pursuant to paragraphs 1 and 2. This obligation shall not cover sensitive operational data in relation to the activities of law-enforcement authorities.

4.   Where, within 15 calendar days of receipt of the information referred to in paragraph 3, no objection has been raised by either a Member State or the Commission in respect of an authorisation issued by a market surveillance authority of a Member State in accordance with paragraph 1, that authorisation shall be deemed justified.

5.   Where, within 15 calendar days of receipt of the notification referred to in paragraph 3, objections are raised by a Member State against an authorisation issued by a market surveillance authority of another Member State, or where the Commission considers the authorisation to be contrary to Union law, or the conclusion of the Member States regarding the compliance of the system as referred to in paragraph 3 to be unfounded, the Commission shall, without delay, enter into consultations with the relevant Member State. The operators concerned shall be consulted and have the possibility to present their views. Having regard thereto, the Commission shall decide whether the authorisation is justified. The Commission shall address its decision to the Member State concerned and to the relevant operators.

6.   Where the Commission considers the authorisation unjustified, it shall be withdrawn by the market surveillance authority of the Member State concerned.

7.   For high-risk AI systems related to products covered by Union harmonisation legislation listed in Section A of Annex I, only the derogations from the conformity assessment established in that Union harmonisation legislation shall apply.

Frequently Asked Questions

Can high-risk AI systems be placed on the market without the standard conformity assessment?
Only temporarily and in exceptional situations, such as protecting public security, safeguarding the life and health of persons, protecting the environment, or securing key industrial and infrastructural assets. The request must be duly justified, and the required conformity assessment procedures must still be completed without undue delay.

Can law-enforcement or civil protection authorities use a high-risk AI system urgently without prior approval?
Yes. In urgent cases where public security or the life or physical safety of individuals is seriously threatened, they may put a high-risk AI system into service without prior authorisation, but they must request authorisation during or promptly after the use. If that authorisation is later refused, use of the system must stop immediately and all results and outputs of that use must be discarded.

Who decides whether such a temporary authorisation is justified?
The relevant market surveillance authority first checks that the high-risk AI system complies with the applicable requirements. If other Member States or the European Commission raise concerns, further consultations follow, and a final evaluation by the Commission determines whether the authorisation remains valid or must be withdrawn.

Do the European Commission and other Member States review these temporary authorisations?
Yes. They have 15 calendar days to object after being informed. If objections are raised, the Commission consults the parties involved, including the operators of the AI system, and takes a final decision on whether the authorisation is justified or should be withdrawn.
