Keywords/Tags: Artificial Intelligence, AI, ChatGPT, generative AI, jailbreak
Article Source: SecurityWeek
Microsoft has tricked several generative AI models into providing forbidden information using a jailbreak technique named Skeleton Key.