Hacker tricks ChatGPT into giving out detailed instructions for making homemade bombs

An explosives expert told TechCrunch that the ChatGPT output could be used to build a detonable device and was too sensitive to publish.

© 2024 TechCrunch. All rights reserved. For personal use only.
