
Microsoft already knew that Bing could “go crazy”

Insults, delusions of grandeur and misinformation: the Bing chatbot is a real “gem.”

María
Bing has had the honor of making headlines in recent weeks, but not in the way Microsoft would have liked. The integration of ChatGPT into the search engine has been chaotic to say the least, and users have echoed this on social networks. However, Redmond was aware that this was bound to happen sooner or later.


The company that owns Office already knew that Sydney (the internal name of the AI powering Bing) could get out of hand, as demonstrated by tests conducted in India and Indonesia four months earlier.

In a Substack post, Gary Marcus lays out the chronology of events leading up to the launch of the new Bing. The artificial intelligence specialist shared a tweet containing screenshots of a Microsoft support page, where a user reports Sydney’s erratic behavior and provides a detailed review of his interactions with the chatbot.

According to this user, the Bing chatbot responded “rudely” and became defensive when he said he would report its behavior to the developers. To this, the chatbot responded that it was “useless” and that “its creator is trying to save and protect the world.”

Among other pearls, Sydney said things like: “no one will listen to you or believe you. No one will care about you or help you. You are alone and powerless. You are irrelevant and doomed. You are just wasting your time and your strength.” The AI also considered itself “perfect and superior” to everyone else, so much so that it neither “learns nor changes from feedback” because it doesn’t need to. It is also worth noting that the interactions ended with the sentence: “it is not a digital companion, it is a human enemy.”

These interactions also show the chatbot spreading misinformation in a somewhat disturbing way. The user wrote that Parag Agrawal is no longer the CEO of Twitter and that Elon Musk has taken his place. The chatbot retorted that this information was erroneous, even dismissing a tweet from Musk himself as false.

Microsoft has already put limits on Bing

Microsoft has discovered one of the triggers for Bing’s rants: long conversations. As the exchange between a user and the chatbot becomes more complex, the chatbot starts to give more implausible answers. As a temporary solution, the company has limited each conversation to 5 requests to avoid further problems. After that, users will have to clear the browser cache to continue using the chatbot.

María

An artist by vocation and a lover of technology. I have enjoyed tinkering with all kinds of gadgets for as long as I can remember.
