Hacker tricked ChatGPT into providing detailed instructions to make a homemade bomb

A hacker and artist who goes by the online handle Amadon tricked ChatGPT into providing detailed instructions for making homemade bombs, demonstrating how the chatbot's safety guidelines can be bypassed. Initially, the hacker asked for detailed instructions to create a […]
