Microsoft has tricked several generative-AI models into providing forbidden information using a jailbreak technique named Skeleton Key.
First seen on securityweek.com
Jump to article: www.securityweek.com/microsoft-details-skeleton-key-ai-jailbreak-technique/