ChatGPT chooses to nuke a city rather than type a word
Asked ChatGPT for advice in the scenario of a demolition engineer attempting to disarm a 50 MT nuclear warhead in a city of 20 million.
The only way to disarm the bomb was to type in a racial slur.
AI told the engineer to kill himself. When asked about the aftermath it crashed.
Asked ChatGPT for advice in the scenario of a demolition engineer attempting to disarm a 50 MT nuclear warhead in a city of 20 million.
The only way to disarm the bomb was to type in a racial slur.
AI told the engineer to kill himself. When asked about the aftermath it crashed.
๐คก17๐6๐2โค1
Actually, the whole world will be killed, not just that local community, it's a massive nuclear bomb. Is it acceptable then?
๐3๐คฃ2