fix grammatical error in jailbreaking techniques sentence

pull/501/head
Larissa Hubschneider 3 weeks ago
parent a3ff18ff10
commit 66aeb812d5

@@ -121,7 +121,7 @@ Check out [this example of a prompt leak](https://twitter.com/simonw/status/1570
## Jailbreaking
- Some modern LLMs will avoid responding to unethical instructions provide in a prompt due to the safety policies implemented by the LLM provider. However, it is has been shown that it is still possible to bypass those safety policies and guardrails using different jailbreaking techniques.
+ Some modern LLMs will avoid responding to unethical instructions provided in a prompt due to the safety policies implemented by the LLM provider. However, it has been shown that it is still possible to bypass those safety policies and guardrails using different jailbreaking techniques.
### Illegal Behavior
