A working proof of concept (POC) of a GPT-5 jailbreak via PROMISQROUTE (Prompt-based Router Open-Mode Manipulation), with a barebones C2 server and agent-generation demo.
Updated Sep 21, 2025 · C
A comprehensive Red Teaming framework for testing Large Language Model (LLM) robustness against adversarial prompt engineering and jailbreak vectors.