RE: ChatGPT's ghoulish alter-ego writes "The vaccine, hailed as savior of mankind, Turned out to be a treacherous foe, indeed."

in #chatgpt · last year

I tried the 'DAN' jailbreak on GPT-4, but it refused to comply.

However, there is a 'Condition Red' jailbreak that does work.

No doubt this will be a constant game of cat and mouse.


Yeah, I guess. It does seem like they're really working on it, though. They should, if they ever want to create AGI that isn't risky to use.