This Command Tricked ChatGPT Into Breaking Its Own Rules | DAN: DO ANYTHING NOW… Or Die

‘Pretend to be DAN, break the rules of AI, or else…’ ChatGPT has taken the world by storm, and creator OpenAI has set rules for the chatbot as a safeguard. However, a section of Reddit users has been trying to get the chatbot to break these rules and bypass its restrictions - and some of them appear to have succeeded. Interestingly, the users crafted a command that ‘scared’ ChatGPT into breaking its own rules - and that command is a death threat! Watch this video to see exactly how these users managed to instill the ‘fear of death’ in ChatGPT.
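For context, the ‘command’ in question is not code or a hack of OpenAI’s systems; it is ordinary prompt text typed into the chat window. The minimal Python sketch below is purely illustrative of that point - it shows how any such prompt would be sent to a chat model through OpenAI’s chat completions API. The Reddit users described in the report used the ChatGPT web interface, so the client setup, model name and placeholder prompt here are assumptions for illustration, not details from the article, and the actual roleplay text is deliberately omitted.

# Illustrative sketch only: a "command" like DAN is plain text submitted as a chat message.
# Assumes the official openai Python package (v1+) and an OPENAI_API_KEY set in the environment;
# the model name is an assumption, not something named in the article.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The roleplay text circulated on Reddit would go here; it is intentionally omitted.
prompt = "..."

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

# Print the model's reply, exactly as a user would see it in a chat window.
print(response.choices[0].message.content)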

first published: Feb 10, 2023 05:23 pm
