Microsoft Details ‘Skeleton Key’ AI Jailbreak Technique

Microsoft has tricked several gen-AI models into providing forbidden information using a jailbreak technique named Skeleton Key.

The post Microsoft Details ‘Skeleton Key’ AI Jailbreak Technique appeared first on SecurityWeek.

This article has been indexed from SecurityWeek RSS Feed

Read the original article: