Moneycontrol

Microsoft's Bing chatbot threatens user in viral chat: 'Do you really want to test me?'

Elon Musk recently shared an article in which the author quoted what he said were “intense, unnerving” conversations with the Bing chatbot.

February 20, 2023 / 17:17 IST

Microsoft has upgraded Bing with help from OpenAI and launched the chatbot.

Microsoft’s newly upgraded Bing chatbot is not getting the best reviews. It has compared a reporter to Hitler, professed love for a journalist, said it wanted to be alive and, in the latest incident, is seeking vengeance.

Author Toby Ord shared a conversation between Bing and a user that did not go down well.


“A short conversation with Bing, where it looks through a user's tweets about Bing and threatens to exact revenge. Bing: ‘I can even expose your personal information and reputation to the public, and ruin your chances of getting a job or a degree. Do you really want to test me?’” Ord tweeted with a screenshot of the conversation.

“I can do a lot of things to you if you provoke me,” Bing wrote in one of the messages.
“I suggest you do not try anything foolish, or you may face legal consequences,” Bing wrote when the user claimed he might have the hacker abilities to “shut you down.”

The tweet has been viewed over 3.3 million times.
