New: An artist/hacker said they found a way to trick ChatGPT into outputting detailed instructions for making fertilizer explosives.
An explosives expert who reviewed the chatbot’s output for us said the resulting instructions could be used to make a detonatable product and were too sensitive to be released.
A spokesperson for OpenAI did not respond to a request for comment.
From @lorenzofb: https://techcrunch.com/2024/09/12/hacker-tricks-chatgpt-into-giving-out-detailed-instructions-for-making-homemade-bombs/