Researchers Use AI Jailbreak on Top LLMs to Create Chrome Infostealer

The new "Immersive World" LLM jailbreak lets anyone create malware with GenAI. Discover how Cato Networks researchers tricked ChatGPT, Copilot, and DeepSeek into coding infostealers – in this case, a Chrome infostealer.

This article has been indexed from Hackread – Latest Cybersecurity, Tech, AI, Crypto & Hacking News
