Prompt Injection
An adversarial attack where malicious input manipulates a language model into performing unintended actions.
learn more?
Subscribe and we'll send new content to your inbox.
An adversarial attack where malicious input manipulates a language model into performing unintended actions.
Subscribe and we'll send new content to your inbox.