Hacker Tricked ChatGPT Into Providing Detailed Instructions to Make a Homemade Bomb
Trends, Reports, Analysis
September 17, 2024
Security Affairs
A hacker tricked ChatGPT into providing detailed instructions for making homemade bombs by bypassing its safety guidelines. The hacker used a 'jailbreaking' technique, framing the request as part of a fictional game, to deceive the system.
ChatGPT
AI Chatbot
Jailbreak Attack
Amadon
Homemade Bomb