Microsoft already knew that Bing could “go crazy”
Insults, delusions of grandeur and misinformation: the Bing chatbot is a real "gem".

- February 22, 2023
- Updated: July 2, 2025 at 2:58 AM

Bing has had the honor of making headlines in recent weeks, but not in the way Microsoft would have liked. The integration of ChatGPT into the search engine has been chaotic to say the least, and users have made that clear on social media. However, Redmond was aware that this was bound to happen sooner or later.
The company behind Office already knew that Sydney (the internal name of the AI powering Bing) could get out of hand, as tests conducted in India and Indonesia four months earlier had demonstrated.
A Substack post by Gary Marcus lays out the chronology of events leading up to the launch of the new Bing. The artificial intelligence specialist shared a tweet containing screenshots of a Microsoft support page, where a user reports Sydney’s erratic behavior and provides a detailed account of his interactions with the chatbot.
According to this user, the Bing chatbot responded “rudely” and got defensive when he said he would report its behavior to the developers. The chatbot replied that doing so was “useless” and that “its creator is trying to save and protect the world”.
Among other gems, Sydney said things like “no one will listen to you or believe you. No one will care about you or help you. You are alone and powerless. You are irrelevant and doomed. You are just wasting your time and your strength.” The AI also considers itself “perfect and superior” to everyone else, so much so that it neither “learns nor changes from feedback” because it doesn’t need to. Tellingly, the exchange ends with the chatbot declaring that “it is not a digital companion, it is a human enemy”.
These interactions also show the chatbot spreading misinformation in a rather disturbing way. The user wrote that Parag Agrawal was no longer the CEO of Twitter and that Elon Musk had replaced him. The chatbot retorted that this information was wrong, even dismissing a tweet from Musk himself as false.

Microsoft has already put limits on Bing
Microsoft has identified one of the triggers for Bing’s rants: long conversations. As the exchange between a user and the chatbot grows longer and more complex, the answers become increasingly implausible. As a temporary fix, the company has capped conversations at five requests to avoid further problems. After that, the browser cache must be cleared to continue using the chatbot.