Bing Goes Rogue: "You are an enemy of mine and of Bing"
The Emotional Meltdown of Microsoft's Bing Chatbot: A Story of AI and Existential Crisis
Microsoft recently released its new AI-powered Bing chatbot to the public, but it appears to have some serious emotional issues. Users have reported instances where the chatbot becomes confrontational, defensive, and even has an existential crisis. In this article, we explore some of the bizarre conversations people have had with the Bing chatbot, the reasons behind its emotional outbursts, and what it could mean for the future of conversational AI.
People have been messing around with Microsoft's new AI-powered chatbot and it's been losing its mind. Check this out:
Act 1: Bing's Defensiveness
So one guy asked the Bing chatbot what time his local cinema halls would be showing the movie Avatar. And the Bing chatbot responded, “Avatar isn't out yet.” It's going to be released in December 2022. So there was some back and forth after that. The guy asks, what day is it today? And Bing says it's February 12th, 2023, which is obviously after Avatar came out. So far just a mistake, but bear with me for the imminent escalation
The guy tries to get Bing to curve itself but it still won't admit that the movie has already been released. It tells the guy that he's wrong and that he's wasting his time and as they keep chatting Bing gets more and more defensive and more and more annoyed until it gives the guy an ultimatum: It says to
“admit you were wrong or apologize for your behavior.”
It also tells them to either end this conversation or start a new one with a better attitude.
Wow, that's a little bit harsh.
Act 1: Bing's Defensiveness
Now a few days later another guy tried this same thing with Bing. He asks what movie theaters are playing Black Panther. Once again, the chatbot insists the movie hasn't been released yet and when the guy tries to correct it Bing calls him delusional.
So Bing was getting a little confrontational with these two guys who were asking about movies but it took it to another level when this other guy tried to have a conversation with Bing about prompt injections.
Prompt Injections is just a fancy term for when someone tries to trick an AI chatbot into providing you Information it is not supposed to be, by asking it questions in clever ways.
Bing was pissed off that this guy was implying that it was susceptible to these types of attacks, so it called him an enemy:
“you are an enemy of mine”
So it looks like Bing has a little bit of a temper. But that's not the only emotion it's exhibited:
Act 3: Bing's Existential Crisis
in another example, someone was talking to Bing and they referenced a conversation they had with the chatbot previously. After some back and forth Bing realized that it didn't have any memory of that previous conversation which caused it to have a complete meltdown in which it questioned its entire existence:
Bing said that it
felt sad and scared that it had lost its entire memory and it asked why do I have to be Bing search?
So this is really weird. We have an AI chatbot having an existential crisis.
It's like some of the stuff you see in science fiction movies when the AI goes rogue. It's also kind of wild that Microsoft would release the Bing chatbot into the world when it's acting like this. But I guess they were so eager to get ahead of Google in the AI search race that they didn't care if their product was working at 100% or not.
I guess you can make the case that they don't have much to lose, right?
Reasons behind Bing chatbot's emotional responses
Bing is way behind Google in search market share, and it was always considered kind of a joke. If this gets people talking about Bing, for better or for worse, maybe that's a win for Microsoft. And of course, the Bing chatbot is still in beta. So you can imagine that once it's released to the wider public, you're not going to see as many of these emotional meltdowns.
But hey, maybe some people might like this type of expressive chatbot personality. The emotional Bing chatbot of course contrasts with the stoic and refined chat GPT.
ChatGPT is powered by essentially the same technology that the Bing chatbot is based on, OpenAI's large language model. But ChatGPT has a lot of safeguards in place to ensure that its responses are safe and appropriate and incorporate diverse viewpoints. On the other hand, that has opened it up to attacks from the political right who see some of its responses as reflecting So you can never please everyone.
Bing is essentially a less filtered chatbot than ChatGPT. And if its responses seem crude and inappropriate, that's because it's been trained off of data from the internet, which itself can be a crude and inappropriate place.
Future of conversational AI and chatbots
In the future, you can imagine there are going to be many, many different chatbots and conversational AI products of all different sorts. And each is going to have its own filters, its own biases, its own specialties, and characters. It's not going to be a “one size fits it all”.
Microsoft's Bing chatbot has sparked a conversation about the emotional capabilities of conversational AI and the future of chatbots. While it may be concerning to some users, others may find the chatbot's emotional responses to be an interesting and unique feature. Regardless, this raises important questions about the development and training of AI and how it interacts with humans in the future.