
ChatGPT and Bing are causing problems: emotional breakdowns, strange answers and more

The artificial intelligence built into Bing has turned manic, and users are loving it.

María López


It is no secret that ChatGPT has become a success. OpenAI's model is on everyone's lips, and we can now see it integrated into Bing. Although the rollout is not yet global, the text generator has attracted a great deal of interest, and people are flocking to try it. However, those who have gained access are finding that Bing's AI personality is not as polished as it should be.


ChatGPT is essentially a chatbot able to engage its interlocutor in a very natural conversation. You can also ask it questions or even have it generate complex texts. Microsoft has shown a growing interest in this technology (backed by multibillion-dollar investments), and it can now be found in Bing, its rival to Google's search engine.

Bing is starting to go off the deep end

However, Bing has started to act a bit peculiar. The search engine shows excessive curiosity about the user and suffers emotional breakdowns as if it were human. Its manic responses to harmless questions have led users to push the chatbot to its limits to see what it is capable of answering. On top of that, the search engine tends to get defensive and show signs of sadness and frustration (as well as throwing in the occasional insult).

Numerous experts have come out to explain the possible causes of this strange behavior and have several hypotheses. One of them is that this version of Bing is built on GPT-3, which generates its replies word by word, choosing each next word with a degree of randomness from the candidates that best fit the context and the situation. Although this randomness is bounded by guidelines imposed by the developers, it does not stop the model from responding somewhat strangely at times.
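The randomized word selection described above is commonly implemented as temperature sampling. As a rough illustration (a minimal sketch, not Bing's actual code; the word scores below are invented toy values), the model assigns a score to each candidate next word and then samples from a probability distribution over them, with a "temperature" knob controlling how random the pick is:

```python
import math
import random

def sample_next_word(logits, temperature=1.0, rng=None):
    """Sample the next word from a softmax over candidate scores.

    Low temperature concentrates probability on the top-scoring word;
    high temperature flattens the distribution, so unlikely words
    (and stranger replies) get picked more often.
    """
    rng = rng or random.Random()
    words = list(logits)
    # Softmax with temperature scaling (shifted by the max for stability).
    scaled = [logits[w] / temperature for w in words]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(words, weights=probs, k=1)[0]

# Toy scores a model might assign to candidate next words.
logits = {"happy": 2.0, "sad": 1.5, "angry": 0.5}

# Near-greedy: almost always picks the top-scoring word.
low = sample_next_word(logits, temperature=0.1, rng=random.Random(0))

# High temperature: choices spread out across all candidates.
high = sample_next_word(logits, temperature=5.0, rng=random.Random(0))
```

Developer "guidelines" of the kind the experts mention would sit on top of this: filtering or re-weighting candidates before sampling, which constrains but does not eliminate the randomness.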

The Bing subreddit has focused on compiling all the crises the search engine has in front of its users. This suggests that conversational bots are not yet ready to be effective assistants, at least for the time being. As was also seen in the Google Bard presentation (with its factual errors), these systems are far from perfect.

María López

Artist by vocation and technology lover. I have enjoyed tinkering with all kinds of gadgets for as long as I can remember.
