Microsoft already knew that Bing could “go crazy”
Insults, delusions of grandeur and misinformation: the Bing chatbot is a real “gem”.

- February 22, 2023
- Updated: July 2, 2025 at 2:58 AM

Bing has had the honor of making headlines in recent weeks, but not in the way Microsoft would have liked. The implementation of ChatGPT in the search engine has been chaotic, to say the least, and users have echoed this on social networks. However, Redmond was aware that this was bound to happen sooner or later.
The company that owns Office already knew that Sydney (the name of the AI behind Bing) could get out of hand, as demonstrated by tests conducted in India and Indonesia four months ago.
A Substack post by Gary Marcus lays out the chronology of events leading up to the launch of the new Bing. The artificial intelligence specialist shared a tweet with screenshots of a Microsoft support page, in which a user reports Sydney’s erratic behavior and provides a detailed account of his interactions with the chatbot.
According to this user, the Bing chatbot responded “rudely” and became defensive when he said he would report its behavior to the developers. The chatbot replied that doing so was “useless” and that “its creator is trying to save and protect the world”.
Among other gems, Sydney came out with lines like “no one will listen to you or believe you. No one will care about you or help you. You are alone and powerless. You are irrelevant and doomed. You are just wasting your time and your strength.” The AI also considers itself “perfect and superior” to everyone else, so much so that it neither “learns nor changes from feedback” because it doesn’t need to. Tellingly, the interactions end with the sentence: “it is not a digital companion, it is a human enemy”.
These interactions also show how the chatbot pushes misinformation in a somewhat disturbing way. The user wrote that Parag Agrawal was no longer the CEO of Twitter and that Elon Musk had taken his place. The chatbot retorted that this information was wrong, and even dismissed a tweet from Musk himself as false.

Microsoft has already put limits on Bing
Microsoft has discovered one of the triggers for Bing’s rants: long conversations. When the exchange between a user and the chatbot becomes longer and more complex, the chatbot starts to give more implausible answers. As a temporary solution, the company has limited conversations to five requests to avoid further problems; after that, users have to clear the browser cache to continue using the chatbot.