As part of the EU’s upcoming AI Act, negotiators are reportedly discussing additional restrictions for the largest AI models, such as OpenAI’s GPT-4. According to Bloomberg, representatives in the European Union are debating a plan for extra controls on the largest artificial intelligence (AI) systems.
The European Commission, European Parliament, and various EU member states are said to be discussing the potential effects of large language models (LLMs), such as Meta’s Llama 2 and OpenAI’s GPT-4, and any additional restrictions that might be imposed as part of the upcoming AI Act.
According to Bloomberg, people close to the negotiations said the idea is to avoid burdening young businesses with too many restrictions while keeping the largest models in check. Any agreement among negotiators on the matter, according to sources, is still at an early stage.
The newly proposed LLM rules under the AI Act would take an approach similar to that of the EU’s Digital Services Act (DSA).
The DSA, recently introduced by EU lawmakers, requires platforms and websites to have standards in place to protect user data and to monitor for illicit activity. The web’s largest platforms, however, are subject to stricter rules. Companies in this category, such as Alphabet and Meta, had until August 28 to update their service practices to comply with the new EU rules.
The EU’s AI Act is intended to be one of the first sets of mandatory AI rules enacted by a Western government. China has already adopted its own artificial intelligence legislation, which took effect in August 2023.
Under the EU’s AI legislation, companies developing and deploying AI systems would be required to conduct risk assessments and label AI-generated content, and would be prohibited from using biometric surveillance, among other measures. However, the legislation has not yet been adopted, and member states can still disagree with any of the parliament’s proposals. In China, more than 70 new AI models have reportedly been released since its AI rules took effect.