If you are a user of the new Bing with ChatGPT, be careful with the information you share. A recent change to the Terms of Use states that Microsoft can store your chatbot conversations. While the company says the new policy is meant to ensure the proper use of artificial intelligence, some see it as too intrusive.

As reported RegisterJuly 30, 2023 update adds sections, Fighting the Misuse of Bing for Insider Information. Users will not be able to use the chatbot to discover any underlying components of models, algorithms, and systems. You are also not permitted to use methods to extract or collect web data from the Service to create, train, or enhance other artificial intelligence models.

Microsoft is serious about protecting your browser and is determined to punish anyone who breaks the rules. If we remember, Bing’s early days with ChatGPT were chaotic for the company after several users tested its security. One of them forced Bing to reveal its codename and how it works.

Kevin Liu, security expert, accessed confidential information through quick hack, a technique that tricks a chatbot into answering unauthorized questions. In addition to Liu, other analysts managed to make him plan attacks or go crazy after a long conversation. With this in mind, Microsoft has closed the gap and updated its security barriers to prevent further information leaks from Bing.

Microsoft takes aggressive stance on Bing defense

Microsoft Bing AI with ChatGPT
Microsoft Bing with ChatGPT by Midjourney AI

The competition in the AI ​​race has forced many companies to change their stance on protecting their technology. Just like Twitter or Reddit Microsoft doesn’t want third parties to have access to the dataset, models, or algorithms behind Bing. with ChatGPT.

“As part of the provision of the AI ​​services, Microsoft will process and store your service inputs and service outputs to monitor and prevent misuse or malicious use or output,” the Terms update says. Services.

Microsoft does not indicate how long your hints, although judging by the reason it could be 30 days. Metaservices keep copies of your messages for a similar period of time. In emergency situations or on accounts related to criminal investigation, data is stored for up to 90 days.

The only way to prevent Microsoft from saving your Bing conversations is to be an enterprise edition user. Policies for the ChatGPT browser vary by company and do not allow the technology to store history or train its models using the information you share.

The new policy will come into effect on September 30, 2023.so you still have time to delete your chat history.

Source: Hiper Textual

Previous articleA venture fund was launched in Russia to support start-ups in the field of electric propulsion.
Next articlePornhub launched a section with short vertical videos
I am Garth Carter and I work at Gadget Onus. I have specialized in writing for the Hot News section, focusing on topics that are trending and highly relevant to readers. My passion is to present news stories accurately, in an engaging manner that captures the attention of my audience.

LEAVE A REPLY

Please enter your comment!
Please enter your name here