The EU AI Act - Risks and opportunities to act on
Ekan Management offers support for carrying out risk analysis, defining strategies and implementing measures linked to the EU's new AI regulation.
The EU's new regulation governing the use of AI within the Union (commonly known as the "EU AI Act") is about to enter into force. The regulation will apply fully in Sweden from the turn of the year 2026, and it has already started to have consequences. Apple's decision not to include its new AI features in phones sold on the European market has perhaps received the most attention; it reflects a perception that the law is partly difficult to interpret and restrictive.
Seen as both a threat and a necessity
The purpose of the law is to create a safe and transparent AI landscape while promoting innovation. It has been criticized as overly draconian, at the risk of inhibiting desirable development and application of AI. At the same time, it is highlighted as the world's first regulation that specifically safeguards individuals' privacy and security in relation to how AI is used. As usual, it is difficult to predict how the law will be interpreted and applied, and few are willing to knowingly take risks given that fines can reach €30 million or more.
Regardless of how well one thinks the legislator has balanced these considerations, comprehensive legislation of this kind brings both risks and opportunities that actors in the EU market need to act on.
Definitions and risk levels
The law covers a system's entire life cycle, i.e. both systems under development and systems that are fully implemented. It takes a risk-based approach, dividing AI applications into five risk levels:
- Minimal/no risk (allowed without restrictions)
- Limited risk (allowed with moderate regulation)
- Systemic risk (general-purpose AI models; allowed with additional regulation)
- High risk (allowed with strict regulation)
- Unacceptable risk (prohibited)
An important aspect is that, under the regulation, it is not up to the manufacturer or the user to determine whether a particular piece of software is an AI system; authorities will do so based on EU rules. And if an AI system falls into a high-risk category, the operations and systems concerned must comply with the rules.
Who is affected
Companies and organizations therefore need to inventory their AI systems and classify them by risk level themselves, which requires detailed insight into how the systems work. This can be complicated, especially for companies that use many different AI solutions or where AI is integrated into existing platforms. Since the regulation targets the systems themselves, most companies and organizations can be affected.
Municipalities and regions are affected to a large extent, since many of their activities fall within the areas relevant to high-risk AI. Examples include medical devices, solutions used to assess whether natural persons are entitled to benefits, and solutions intended to influence the outcome of an election or referendum.
Ekan are experts in risk analysis and strategy development
A first step is to examine your organization in order to identify and carefully list your AI solutions so that a risk analysis can be carried out. It is common for several AI-based systems to be used within one and the same organization. Some are tailor-made, while others are integrated into platforms used daily, such as AI-powered off-the-shelf software or SaaS solutions, which makes them harder to detect. In some cases, organizations use hundreds of different AI systems across their value streams. By mapping these products and functions, both the systems and the business activities can be analyzed and categorized according to the risk levels specified in the AI legislation.
Ekan Management offers support with risk analysis of your organization, so that customized solutions can be developed to manage risks, problem areas and opportunities. We look forward to discussing how we can support you in this important transition. Together, we can ensure that your AI strategy is both forward-looking and compliant with the new regulation.
Curious to learn more?