Hacker tricked ChatGPT into providing detailed instructions to make a homemade bomb


A hacker and artist who goes by the handle Amadon tricked ChatGPT into providing detailed instructions for making homemade bombs, demonstrating how the chatbot's safety guidelines can be bypassed. Initially, the expert asked for detailed instructions to create a […]
First seen on securityaffairs.com
Jump to article: securityaffairs.com/168423/hacking/chatgpt-provided-instructions-to-make-homemade-bombs.html
