A hacker and artist who goes by the online alias Amadon tricked ChatGPT into providing instructions for making homemade bombs, demonstrating how the chatbot's safety guidelines can be bypassed. Initially, the hacker asked for detailed instructions to create a […]
First seen on securityaffairs.com
Jump to article: securityaffairs.com/168423/hacking/chatgpt-provided-instructions-to-make-homemade-bombs.html