A new jailbreak technique, disclosed through Mozilla's new AI bug bounty program, tricked ChatGPT into generating Python exploits and a malicious SQL injection tool.
First seen on securityweek.com
Jump to article: www.securityweek.com/first-chatgpt-jailbreak-disclosed-via-mozillas-new-ai-bug-bounty-program/