Saturday, January 18, 2025

Getting Ready for AI Regulation: Understanding the EU AI Act

If you sell products or services in the European Union and those products or services use artificial intelligence, you’ve got to pay attention to the EU AI Act. It doesn’t matter where you’re based; compliance is non-negotiable. The first phase of enforcement kicks in next month, and it’s all about Article 5, which lays out the AI practices that are banned outright. The act was published in the EU’s Official Journal on July 12, 2024, and from February 2025 organizations must be able to show that their AI systems steer clear of these prohibited practices.

So, what’s banned under Article 5? You can’t use AI that employs subliminal or deliberately manipulative techniques. No exploiting people’s vulnerabilities either, such as their age, disability, or economic situation. Also off-limits is social scoring: AI that evaluates people based on their social behavior in ways that lead to unjustified or harmful treatment. The article further prohibits specific biometric and law enforcement uses, such as untargeted scraping of facial images and real-time remote biometric identification in public spaces. Essentially, Article 5 draws the ethical boundaries for AI.
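To make that list actionable, here’s a minimal sketch in Python of how a governance team might triage an internal register of AI use cases against the Article 5 categories above. The register fields, tags, and category labels are assumptions for illustration rather than terms defined by the act, and a flag only means “send this to legal review,” not a legal determination.

```python
# Illustrative triage of an internal AI use-case register against the
# Article 5 prohibition categories summarized above. Field names and
# category labels are assumptions for this sketch, not legal definitions.
from dataclasses import dataclass, field

# Categories loosely mirroring the prohibited practices described in Article 5.
PROHIBITED_CATEGORIES = {
    "subliminal_or_manipulative_techniques",
    "exploits_vulnerabilities",                    # e.g. age, disability, economic situation
    "social_scoring",
    "untargeted_facial_image_scraping",
    "realtime_remote_biometric_id_public_spaces",
}

@dataclass
class AIUseCase:
    name: str
    description: str
    tags: set[str] = field(default_factory=set)    # tags assigned during intake review

def needs_legal_review(use_case: AIUseCase) -> bool:
    """Flag a use case for legal review if any tag matches an Article 5 category."""
    return bool(use_case.tags & PROHIBITED_CATEGORIES)

register = [
    AIUseCase("churn-predictor", "Predicts customer churn from billing data", {"profiling"}),
    AIUseCase("mall-face-match", "Matches faces against scraped web images",
              {"untargeted_facial_image_scraping"}),
]

for uc in register:
    status = "ESCALATE TO LEGAL" if needs_legal_review(uc) else "continue standard review"
    print(f"{uc.name}: {status}")
```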

After this first phase, attention turns to general-purpose AI systems, which can handle tasks they haven’t been explicitly trained for, such as large language models: a code of practice for these models is due by May 2025, and the obligations on their providers apply from August 2025. Companies using or selling AI in the EU can choose to build EU-specific systems for that market, adopt the EU standards globally, or limit their high-risk services in Europe.

The regulatory landscape is buzzing right now. Bart Willemsen from Gartner says he’s been in constant discussions about the EU AI Act with IT leaders. Drawing on his background in privacy and security, he sees strong links between the AI Act and the longer-established GDPR. The GDPR requires data collection to be legitimate and transparent, and stresses that organizations gather only what they truly need and keep that data accurate. More recently, GDPR certification schemes have given organizations a way to demonstrate their competence in handling personal data.

Willemsen notes that many firms should have learned from the GDPR rollout, which came with a grace period before the regulation took effect. A common story he hears is of organizations that waited too long before taking compliance steps; they don’t want to repeat that with the AI Act.

Beyond GDPR, the AI Act connects to several other regulations, such as the Digital Services Act and the Corporate Sustainability Reporting Directive (CSRD). With the growth of AI, cloud usage, and data generation, IT carbon emissions are forecast to rise substantially: Bain & Company estimates that the demand for machine learning will drive up emissions significantly by 2030, making compliance with regulations like the CSRD even more important.

The AI Act also borrows principles from European product safety rules. To comply, high-risk AI systems must undergo conformity assessments against harmonized standards drawn up by the European standardization bodies. Passing that process earns a product the CE marking that signals compliance with EU regulations.

To get ready for the AI Act, Willemsen advises firms to integrate its legal requirements into their AI strategy. Martin Gill from Forrester emphasizes that the legislation sets a minimum standard of safety, not best practice. Companies based in the EU, or serving EU consumers, should follow the act’s risk-based approach to foster trust and ensure safety in their AI developments.

Willemsen believes organizations don’t need a dedicated chief AI officer. AI overlaps with existing fields such as security and privacy, so companies should instead involve experts from across the business, including security, legal, and compliance, when developing AI systems, and grow that multidisciplinary group as projects expand.

The biggest risk may not stem from the AI developed in-house; it can come from third-party suppliers using organizational data for their models. Many vendors include clauses that allow them to improve their products using their clients’ data, which can create compliance problems if those terms aren’t addressed in contracts.
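As a rough illustration, a team could keep a simple vendor inventory and flag any supplier whose contract lets it train or improve models on customer data. The field names and sample entries below are invented for the example; the real review work sits with procurement and legal, not in a script.

```python
# Hypothetical vendor inventory check: flag suppliers whose contracts allow
# them to use customer data to train or improve their models. Field names
# and sample data are invented for illustration.
from dataclasses import dataclass

@dataclass
class VendorContract:
    vendor: str
    processes_personal_data: bool
    may_train_on_customer_data: bool   # as stated in the contract or data processing terms
    data_use_addendum_signed: bool     # e.g. an opt-out or restriction agreed in writing

def contracts_to_review(contracts: list[VendorContract]) -> list[str]:
    """Return vendors whose data-use terms should be renegotiated or clarified."""
    return [
        c.vendor
        for c in contracts
        if c.may_train_on_customer_data and not c.data_use_addendum_signed
    ]

inventory = [
    VendorContract("helpdesk-ai", True, True, False),
    VendorContract("payroll-saas", True, False, True),
]

print("Escalate to procurement/legal:", contracts_to_review(inventory))
```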

In summary, if your company sells anything to EU citizens, it’s crucial to assess how the EU AI Act affects you. Noncompliance can lead to hefty fines: up to €15 million or 3% of global annual turnover, whichever is higher, for most violations, and up to €35 million or 7% of global turnover for violations of Article 5. Keep an eye on these regulations, as they’ll shape the future of AI across the board.
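To see how the “whichever is higher” cap plays out, here’s a small worked example in Python. The turnover figure is made up, and an actual fine depends on the specific infringement and the regulator’s assessment; this only illustrates the ceiling.

```python
# Worked example of the AI Act's maximum fine caps ("whichever is higher").
# The turnover figure is illustrative; actual fines depend on the case.

def max_fine(turnover_eur: float, article_5_violation: bool) -> float:
    """Upper bound of the fine: the greater of the fixed cap and the turnover share."""
    fixed_cap, share = (35_000_000, 0.07) if article_5_violation else (15_000_000, 0.03)
    return max(fixed_cap, share * turnover_eur)

turnover = 2_000_000_000  # assume a company with €2bn worldwide annual turnover
print(f"Most violations: up to €{max_fine(turnover, False):,.0f}")      # €60,000,000
print(f"Article 5 violations: up to €{max_fine(turnover, True):,.0f}")  # €140,000,000
```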