In just two months, ChatGPT reached more than 100 million users. The public was fascinated by the responses of the artificial intelligence powering the chatbot, but privacy issues soon began to emerge, and regulatory authorities in several countries are now tightening their grip on OpenAI.

On March 20, a bug exposed data such as the names, email addresses, billing addresses and credit card details of 1.2% of ChatGPT Plus subscribers, along with users' conversation histories. The vulnerability was active for nine hours and was flagged by cybersecurity firm Greynoise just four days later.

This prompted Italy's data protection authority to question the transparency of the service's data collection and to order a temporary block of the chatbot. OpenAI faces a fine of up to 20 million euros if it does not comply with the measure. The chatbot is also unavailable in around 30 other countries.

How does ChatGPT use data?

ChatGPT draws on a database of more than 300 billion words collected from books, articles, websites and internet posts to generate text based on an algorithm. This allows the chatbot to answer questions on almost any topic in language very close to a human's.

In its privacy policy, OpenAI discloses that it collects personal user data, content submitted to the platform, information from social networks and browsing logs. According to the company, the goal is to “develop new programs and services”, prevent fraud and comply with legal obligations.

The collected data may be shared with OpenAI's affiliates, service providers and business partners without prior notice, or disclosed to comply with legal requirements. According to the company, information is kept only for as long as necessary for business purposes and may be anonymized for research.

What are the privacy issues?

The ChatGPT privacy policy mentions the rights of users in the European Economic Area, the United Kingdom, Switzerland and California, but not the Brazilian General Personal Data Protection Law (LGPD) or the European General Data Protection Regulation (GDPR).

OpenAI does not, for example, clarify exactly how long the information is stored or in which countries it is used. Nor does the service specify how users can revoke consent to the use of their personal data, or make transparent how that information is processed.

Additionally, there are copyright concerns. OpenAI is expected to earn more than $1 billion in 2024, but it does not pay to use content collected from the internet and created by third parties. While proving plagiarism in text is difficult, there are already lawsuits against generative image models such as those from Stability AI and Midjourney.

Should we stop using ChatGPT?

ChatGPT helps increase productivity in content production, and abandoning it does not seem to be an option. However, users should be careful with the information they share, as there is a significant risk of it being used without their explicit consent, especially when it involves personal or commercially confidential data.

Source: Tec Mundo
