In this interview, Me Renaud LE SQUEREN shares his analysis of the implications of the AI Act for businesses, in particular startups, and for the Luxembourg ecosystem.


Silicon Luxembourg: What do you consider the main implications of the AI Act?

Renaud LE SQUEREN: The AI Act starts from the premise that European countries are a bit late in the development of this technology. We’ve known in theory for ages that AI could impact our daily lives and even democracy itself, as seen with the manipulation of votes during events like the Brexit referendum.

However, the European approach has not been to regulate the technology itself but to regulate its use. Regulation, as always, is a major issue for competition and development. The question is, how do we manage to arrive at a level playing field between European companies and others, like those from the US or Asia?

The AI Act will put significant pressure on companies, making it more expensive and more challenging for entities to innovate and use innovation in Europe. The lack of integration between the various European regulations, such as the GDPR and other acts, means that companies might need to deal with multiple supervisory authorities and sets of obligations even if they have a single product.

It’s very good on paper, but the way it has been implemented means that it remains challenging for companies. There is a need for simplification to avoid extensive costs and efforts.

Silicon Luxembourg: What are the most important uses the AI Act seeks to regulate?

RLS: The AI Act has been drafted to avoid potential abuses, such as the manipulation of opinion, both as a consumer and as a voter. We’ve seen how social media creates dependencies and manipulation, and AI can further amplify this impact. AI could be used to automate these processes, making them even more effective at creating dependencies and producing other harmful effects. So regulating these uses is crucial to protecting democratic values.

Business can only develop well in a sane society. If AI remains unregulated, it could lead to societal tensions and conflicts. We see these developments in the US, where political antagonism and societal conflict have grown. AI can increase the antagonism as well as the conflict. I’m thus very much in favor of this regulation for these reasons.

If you read this regulation one article at a time, each provision is relevant and well-founded, but if you look at it in its entirety, the requirements with which businesses need to comply become a massive burden. This needs to be addressed to ensure that it’s manageable for companies.

Silicon Luxembourg: Do you think the AI Act is balanced as it currently stands?

RLS: No, it’s too complicated. Businesses have to start with a gap analysis to understand where they stand. You will have to draft and implement a lot of internal documentation, perform audits of service providers, and carry out assessments of your technology to see whether it falls within the scope of the AI Act. It’s a lot to achieve for just a few processes or activities, especially in the financial sector.

Now that all the regulations – I am talking about the AI Act, DSA, GDPR, DORA to name a few – are on the table, the Commission should take steps to simplify this, reduce the number of supervisory authorities, and align the definitions in these regulations to avoid inconsistencies.

Silicon Luxembourg: What are the most important questions startups and businesses should ask themselves about the AI Act?

RLS: Startups will have to implement certain risk management procedures. If you are not ready to use AI, you will drown in the marketplace. At DSM, we use a lot of AI because it supports our daily work, but we are extremely careful in choosing and assessing these technologies.

Public funding should be put on the table to support the development of technology from a regulatory compliance standpoint. Generally, startups create technical tools, and dealing with regulation comes after that. This is a mistake. Regulation should be on the roadmap from day one.

Where should they start? Public administrations should be able to provide the first insights and a comprehensive overview. There are firms and associations that can help, and startups should connect with these networks to perform an initial gap analysis.

Silicon Luxembourg: How do you suggest companies prepare to best mitigate any potential legal risks?

RLS: Companies should connect with their respective networks, such as Luxinnovation and other public operators, to get an initial overview. They should also link with associations and law firms to perform an initial gap analysis.

The second step would be to develop their financial projections with legal and regulatory advice in mind. Often, startups don’t include legal matters in their business plans, which is a big mistake.

Startups underestimate the importance of legal compliance. At a certain point, the development of their product is blocked because the product is technically ready but not legally compliant, preventing access to the market.

Silicon Luxembourg: What services does DSM offer that would be useful for companies working with AI technologies?

RLS: At DSM Avocats à la Cour, we offer gap analysis services. Our team understands both the technology and the regulations. We simplify and translate the legal obligations for our clients, helping them to efficiently navigate compliance obligations. We are precise with technical details but ensure that the meaning and purpose of regulations are clear. This is our best asset.

We help people understand and comply with everything from a legal standpoint, ensuring they are prepared for market access and can earn investor confidence.