It seems the search engine not only gave a flood of meaningless answers to questions it “didn’t understand”, but also responded to them with hostility.
For example, the system told one tester that she was unhappy in her marriage and had fallen in love with a robot, while another was compared to a villainous character. A bigger constraint on Bing’s ability to hold conversations with users, however, was the search engine’s hard stop on the word “emotions” and anything related to that concept: whenever the topic comes up, the system terminates the dialog. The same applies to “Sydney”, the alias the Bing development team used for internal communication.
Source: Ferra
