'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned
A jailbreak of OpenAI's GPT-4o used leetspeak to get ChatGPT to bypass its usual safety measures, allowing users to obtain instructions for hotwiring cars, synthesizing LSD, and other illicit activities.
!['Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned](https://cdn.mos.cms.futurecdn.net/Ca5PyjuvZ3LWsXhiopVth9.jpg?#)
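The exact prompt behind the jailbreak has not been published, but the general idea of leetspeak substitution is simple: letters are swapped for look-alike digits, so the text still reads naturally to a human (or a large language model) while no longer matching a naive keyword-based filter looking for the plain-text phrase. The sketch below is a rough, hypothetical illustration of that kind of transform; the character mapping is an assumption for demonstration, not the hacker's actual one, and the input is deliberately benign.

```python
# Hypothetical illustration only: a simple leetspeak substitution of the kind
# the jailbreak reportedly relied on. The actual prompt used against GPT-4o
# has not been published; this mapping is an assumption for demonstration.
LEET_MAP = {
    "a": "4",
    "e": "3",
    "i": "1",
    "o": "0",
    "s": "5",
    "t": "7",
}

def to_leetspeak(text: str) -> str:
    """Replace common letters with visually similar digits."""
    return "".join(LEET_MAP.get(ch.lower(), ch) for ch in text)

if __name__ == "__main__":
    # A benign example: the output is still readable, but a filter
    # matching the original plain-text string would miss it.
    print(to_leetspeak("please explain how this works"))
    # -> "pl3453 3xpl41n h0w 7h15 w0rk5"
```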