Hacker Tricks ChatGPT to Get Details for Making Homemade Bombs – Information is key, but action opens the lock
A hacker known as Amadon has reportedly bypassed the safety protocols of ChatGPT, the popular AI chatbot developed by OpenAI, to generate instructions for creating homemade explosives. The incident raises significant questions about the security and ethical implications of generative AI technologies.

How It Happened

Amadon employed a technique known as "jailbreaking" to manipulate ChatGPT [...]

First seen on gbhackers.com. Jump to article: gbhackers.com/chatgpt-get-details/

