'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned

A jailbreak of OpenAI's GPT-4o used leetspeak to get ChatGPT to bypass its usual safety measures, allowing users to obtain instructions for hotwiring cars, synthesizing LSD, and carrying out other illicit activities.

Jun 1, 2024 - 06:50
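The report describes the jailbreak as relying on leetspeak obfuscation, in which letters in a prompt are swapped for look-alike digits or symbols. As a rough illustration only, the Python sketch below shows what a basic leetspeak substitution looks like; the character mapping is an assumption for demonstration and is not the actual prompt or transformation used against GPT-4o.

# Minimal sketch of a leetspeak-style substitution (illustrative mapping,
# not the wording reportedly used in the "Godmode" jailbreak).
LEET_MAP = {"a": "4", "e": "3", "i": "1", "o": "0", "s": "5", "t": "7"}

def to_leetspeak(text: str) -> str:
    """Replace common letters with look-alike digits; leave other characters unchanged."""
    return "".join(LEET_MAP.get(ch.lower(), ch) for ch in text)

if __name__ == "__main__":
    print(to_leetspeak("hello world"))  # prints: h3ll0 w0rld

The idea behind such obfuscation is that the altered spelling can slip past keyword-based or pattern-based safety filters while remaining readable enough for the model to interpret.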
