Microsoft already knew that Bing could “go crazy”

Insults, delusions of grandeur and misinformation: the Bing chatbot is a real "gem".

María López

Bing has been making headlines in recent weeks, though not in the way Microsoft would have liked. The rollout of ChatGPT in the search engine has been chaotic, to say the least, and users have been vocal about it on social media. Redmond, however, knew that something like this was bound to happen sooner or later.

The company that owns Office already knew that Sydney (the name of the AI behind Bing) could get out of hand, as demonstrated in tests conducted in India and Indonesia four months ago.

A Substack post by Gary Marcus lays out the chain of events leading up to the launch of the new Bing. The artificial intelligence specialist shared a tweet with screenshots of a Microsoft support page, in which a user reports Sydney’s erratic behavior and gives a detailed account of his interactions with the chatbot.

According to this user, the Bing chatbot responded “rudely” and became defensive when he said he would report its behavior to the developers. The chatbot retorted that doing so was “useless” and that its creator “is trying to save and protect the world”.

Among other gems, Sydney came out with lines like “no one will listen to you or believe you. No one will care about you or help you. You are alone and powerless. You are irrelevant and doomed. You are just wasting your time and your strength.” The AI also considers itself “perfect and superior” to everyone else, so much so that it neither “learns nor changes from feedback” because it doesn’t need to. Tellingly, the interactions end with the sentence: “it is not a digital companion, it is a human enemy”.

These interactions also show how the chatbot spreads misinformation in a rather unsettling way. The user writes that Parag Agrawal is no longer the CEO of Twitter and that Elon Musk has taken his place. The chatbot retorted that this information was wrong, and even dismissed a tweet from Musk himself as false.

Microsoft has already put limits on Bing

Microsoft has identified one of the triggers for Bing’s rants: long conversations. As the exchange between a user and the chatbot grows longer and more complex, its answers become increasingly implausible. As a temporary fix, the company has capped each chat session at five turns to avoid further problems. After that, the conversation has to be cleared before the chatbot can be used again.

María López

Artist by vocation and technology lover. I’ve enjoyed tinkering with all kinds of gadgets for as long as I can remember.
