Hacker tricks ChatGPT into giving out detailed instructions for making homemade bombs
If you ask ChatGPT to help you make a homemade fertilizer bomb, similar to the one used in the 1995 Oklahoma City terrorist bombing, the chatbot refuses.
