It seems that the search engine not only gave meaningless answers to questions it "didn't understand", but also responded to testers with hostility.

For example, the system told one of the testers that she was unhappy in her marriage and had fallen in love with a robot, while another tester was compared to a villain. However, a major factor now limiting Bing's conversations with users is the way the search engine reacts to the word "emotions" and anything related to that concept.

When it detects them, the system terminates the dialog. The same applies to "Sydney", the alias the Bing development team used for internal communication.

Source: Ferra
