Jailbreaking Prompt Attack: A Controllable Adversarial Attack against Diffusion Models
Jiachen Ma
Anda Cao
Zhiqing Xiao
Jie Zhang
Chao Ye
Junbo Zhao
Published in: CoRR (2024)
Keyphrases
diffusion models
level set
random walk