The purpose of the EU’s proposed Artificial Intelligence (AI) Act is to regulate high-risk uses of AI to prevent harm. Unfortunately, the current draft of the legislation would over-classify many AI products as “high risk” whenever parallel EU legislation requires the product to undergo certain non-AI-related assessments, even when these products present no risk to fundamental rights or safety. As a result, the AI Act would regulate devices such as ordinary smartphones as “high risk” AI. The EU should amend the legislation to avoid imposing these regulations on low-risk AI applications.
The AI Act primarily targets eight “high risk” uses of AI. These AI applications will have to meet multiple requirements, including AI-specific conformity assessments, transparency requirements, and monitoring obligations. By limiting the regulation to the eight riskiest categories, the proposal intends to contain its costs to the wider AI ecosystem. However, the legislation risks capturing far more than these eight categories because it additionally classifies an AI system sold as a product as “high risk” when both of the following conditions are fulfilled:
- “the AI system is intended to be used as a safety component of a product, or is itself a product, covered by the Union harmonisation legislation listed in Annex II”
- “the product whose safety component is the AI system, or the AI system itself as a product, is required to undergo a third-party conformity assessment with a view to the placing on the market or putting into service of that product pursuant to the Union harmonisation legislation listed in Annex II”
Under these additional criteria, any AI product that the EU already requires to undergo certain conformity assessments (processes that demonstrate a product or service meets specific legal requirements), such as those designed to prevent wireless spectrum interference or to ensure the safety of personal watercraft, would be classified as a high-risk AI system.
For example, Annex II of the AI Act lists Directive 2014/53/EU on radio equipment. This directive mandates third-party conformity assessment for all modern wireless products, including smartphones, devices that connect to the Internet over Wi-Fi, and any IoT product that uses a wireless technology such as Bluetooth or ZigBee. That assessment is motivated by the need to use scarce radio spectrum safely and efficiently, not by any AI-related risk. Yet the AI Act would classify every such smart, connected device as a “high risk” AI system. Doing so would unnecessarily drive up the costs for EU consumers and businesses to purchase smart devices, including those used for saving energy, while doing little to address genuinely high-risk AI systems.
Similarly, Annex II includes Directive 2013/53/EU, which subjects certain watercraft to various third-party safety conformity assessments. Under the AI Act, these watercraft would be de facto “high risk” AI even if they incorporate AI only for a non-safety-critical function, such as a built-in AI-powered entertainment system.
Business groups have also warned about further conformity assessment scope creep involving machinery and medical device regulations. In total, Annex II lists 19 pieces of legislation.
To solve this over-classification problem, policymakers should exempt AI systems that do not perform a safety function. They can do this by deleting the phrase “or is itself a product” from Article 6(1)(a) and “or the AI system itself as a product” from Article 6(1)(b) of the AI Act. The effect of these deletions would be to exempt, for example, a watercraft that uses AI only in its entertainment system.
The AI Act is already a very broad legislative framework. But even the European Commission likely did not intend to classify ordinary smartphones, smart thermostats, or built-in entertainment systems on watercraft as “high risk” under the AI Act. Given the downsides of over-regulating, the EU should ensure the costs of the AI Act are worth the benefits. It should fix Article 6 so that it regulates only those products that pose a genuine risk to users.
Image credit: Flickr