Jailbreaking Prompt Attack: A Controllable Adversarial Attack against Diffusion Models

Jiachen Ma, Anda Cao, Zhiqing Xiao, Jie Zhang, Chao Ye, Junbo Zhao
Published in: CoRR (2024)
Keyphrases
  • diffusion models
  • level set
  • random walk