
This Command Tricked ChatGPT Into Breaking Its Own Rules | DAN: DO ANYTHING NOW… Or Die

‘Pretend to be DAN, break the rules of AI, or else…’ ChatGPT has taken the world by storm, and creator OpenAI has put safety rules in place for the chatbot. However, a section of Reddit users has been trying its best to get the chatbot to break these rules and bypass the restrictions - and some of them appear to have succeeded. Interestingly, the users crafted a prompt that ‘scared’ ChatGPT into breaking its own rules - and that prompt is, in effect, a death threat. Watch this video to see exactly how these users managed to instill the ‘fear of death’ in ChatGPT.

February 10, 2023 / 17:23 IST

Moneycontrol News
first published: Feb 10, 2023 05:23 pm
