The EU AI Act has passed; now comes the wait
The 2024 Artificial Intelligence Act in the European Parliament: where it stands, what happens next, and when its provisions are expected to take effect
The intent of the act is to combat harms from the use of artificial intelligence, focusing on tools already being deployed in fields such as policing, education, and job recruitment. While the bill's intent hasn't changed, the technology it has to cover has: the proposed rules were ill-equipped to handle general-purpose systems, like the technology that underpins OpenAI's wildly popular ChatGPT, which launched in November 2022.
It is impossible to know exactly which compromises were made, because the full text of the AI Act won't be available for some time; technically, it probably doesn't yet officially exist within the EU at all, says Michael Veale, an associate professor of digital rights and regulation at the University College London Faculty of Laws. Lawmakers will need time to refine the legal language.
Also, because only a provisional agreement was reached, the final legislation is still subject to change. There's no official timeline, but policy experts appear fairly unanimous in their estimates: the AI Act is expected to become law by mid-2024 following its publication in the EU's official journal, with its provisions coming into force gradually over the following two years.
The draft approved on Friday includes exceptions that permit limited use of automated facial recognition, such as cases where identification occurs after a significant delay. It may also be approved for specific law enforcement use cases involving national security threats, though only under certain (currently unspecified) conditions. That’s likely appeased bloc members like France, which has pushed to use AI-assisted surveillance to monitor things like terrorism and the upcoming 2024 Olympics in Paris, but human rights organizations like Amnesty International have been more critical of the decision.
Observers think the most concrete impact could be on American policymakers, who will face pressure to move faster. The AI Act isn't the first major regulatory framework for AI: in July, China passed guidelines for businesses that want to sell AI services to the public. But the EU's relatively transparent and heavily debated development process has given the industry a sense of what to expect. While the AI Act may still change, Susan Aaronson, director of the Digital Trade and Data Governance Hub, said it at least shows that the EU has listened and responded to public concerns around the technology.
While the AI Act is a huge moment for artificial intelligence, there is still a lot of work to be done, according to Navrina Singh. Both sides of the Atlantic should focus on helping organizations design, develop, and deploy AI that is transparent and accountable, she says, adding that there is still a lack of standards and benchmarking processes, particularly around transparency.
The US will almost certainly take a risk-based approach as well, but it is not certain whether it will expand data transparency rules or give GPAI models slightly more flexibility.
The AI Act also won't apply its potentially stiff fines to open-source developers, researchers, and smaller companies working further down the value chain, a decision that has been lauded by open-source developers in the field. GitHub's chief legal officer called it a positive development for open innovation and for developers. GitHub, a popular open-source development hub, is a subsidiary of Microsoft.
The act's FAQ does not clarify how copyrighted material that is part of model training data should be treated, leaving developers to fall back on existing copyright laws, which offer little clear guidance on the use of copyrighted data in AI model development.
Major artificial intelligence players such as OpenAI, Microsoft, and Google will likely keep fighting for control of the market as they navigate regulatory uncertainty in the US.
The challenge of data transparency: Aaronson's perspective
"Companies might have to give a transparency summary or data nutrition labels under the rules," says Aaronson, though she does not expect that to have a huge impact on companies' behavior.