ChatGPT may no longer be available in Europe

OpenAI CEO Sam Altman has expressed concern about certain aspects of the EU's AI Act!

The European Union's Artificial Intelligence Act is expected to be approved in 2023.

The risks of tools based on generative AI, such as ChatGPT, have begun to outweigh their advantages in the eyes of some. At the world's most important technology companies, including Apple, such third-party solutions are not exactly welcome, and in the European Union they are being watched closely.

According to a post shared by Apple Insider, OpenAI may leave Europe due to the strict regulations planned in the draft AI Act, which seeks, among other objectives, to ensure that artificial intelligence can be trusted.

The future of OpenAI in Europe is unclear

After meeting with officials from Spain, France and the United Kingdom to address the issue of artificial intelligence (AI) regulation, OpenAI CEO Sam Altman has expressed concerns about certain aspects of the AI bill being prepared by the European Union (EU).

According to his statements, some of the provisions put forward in the proposal could have a negative impact on OpenAI's operations, representing potential barriers to compliance in the region with the new regulations, which are expected to come into force in 2025.

The current draft of the EU AI Act may be over-regulating, but we have heard it will be pulled back… There is a lot they could do, like changing the definition of general-purpose AI systems.

In addition, Altman shared OpenAI's position on the bill regulating artificial intelligence on European soil: "If we can comply, we will, and if we can't, we will cease operating. We will try, but there are technical limits to what is possible."

For the executive, one of the main concerns is the categorization system proposed by the European Commission, under which AI platforms would be classified according to the "potential risk" they pose to social welfare.

The proposed rating system establishes four levels: minimal risk, limited risk, high risk and unacceptable risk. Depending on the risk category assigned to each AI project, the companies responsible for those developments must meet stringent requirements in order to operate in the European Union. These requirements scale with the identified risk level and are intended to guarantee security, ethics and transparency in the use of artificial intelligence in the region.

According to the characteristics of each category described in the bill, models such as ChatGPT and GPT-4 would receive a "high risk" rating, meaning OpenAI would have to meet the strictest requirements to operate in the region, which implies higher costs for the company.

These requirements include implementing strong security systems for users, maintaining verifiable human oversight, feeding the models with high-quality data to avoid bias, and logging user activity so it can be traced if needed.

The Artificial Intelligence Act was proposed by the Commission in April 2021, back when few expected AI models from OpenAI and Google to take the lead so soon, and it is expected to be approved in 2023.

Source: iPadizate
