
[question] change image size #14

Open
jS5t3r opened this issue Nov 30, 2023 · 4 comments
Comments

@jS5t3r

jS5t3r commented Nov 30, 2023

I want the generated attacks to have an image size of 256x256.

For that, I thought to change the param res to 256

python main.py --model_name resnet18 --save_dir $save_dir --images_root $image_root --label_path $label_path --res 256  #224 

An error shows up then

  File "main.py", line 164, in <module>
    adv_image, clean_acc, adv_acc = run_diffusion_attack(tmp_image, label[ind:ind + 1],
  File "main.py", line 70, in run_diffusion_attack
    adv_image, clean_acc, adv_acc = diff_latent_attack.diffattack(diffusion_model, label, controller,
  File "/home/user/.conda/envs/DiffPurification/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/user/DiffAttack/diff_latent_attack.py", line 374, in diffattack
    before_attention_map = aggregate_attention(prompt, controller, 7, ("up", "down"), True, 0, is_cpu=False)
  File "/home/user/DiffAttack/utils.py", line 17, in aggregate_attention
    out = torch.cat(out, dim=0)
RuntimeError: torch.cat(): expected a non-empty list of Tensors
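For context, the error means the list handed to `torch.cat` in `aggregate_attention` was empty. A plausible cause (this is a minimal sketch with hypothetical names, not the repository's actual code): the function filters stored attention maps by a hard-coded spatial resolution, and changing `--res` shifts the UNet's latent scales so that no map matches the filter.

```python
import torch

def aggregate_attention_sketch(attention_maps, target_res):
    """Collect attention maps whose flattened spatial size is target_res**2.

    Sketch of the suspected failure mode: if the maps were produced at
    latent scales that no longer include target_res (e.g. after changing
    --res from 224 to 256), the filtered list is empty and torch.cat
    raises "expected a non-empty list of Tensors".
    """
    out = [m for m in attention_maps if m.shape[1] == target_res ** 2]
    if not out:
        # This reproduces the RuntimeError seen in the traceback.
        raise RuntimeError("torch.cat(): expected a non-empty list of Tensors")
    return torch.cat(out, dim=0)

# Hypothetical maps at latent scales 8, 16, 32 (shape: heads x tokens x text_len).
maps_256 = [torch.rand(4, s * s, 77) for s in (8, 16, 32)]

matched = aggregate_attention_sketch(maps_256, 8)   # only the 8x8 maps match
```

Under this assumption, filtering with a resolution of 7 (as in the traceback) against maps generated at 256x256 input would select nothing, which is consistent with the error.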

Besides that, I assume that some parameters must be optimized so that the attack is strong on different parameters. Which parameters are important?

@WindVChen
Owner

Hi @jS5t3r ,

Thank you for pointing this out. I've just updated the code, and I'm optimistic that this update resolves the resolution parameter setting issue.

Regarding parameter optimization, I suggest focusing on the three key loss weights: --attack_loss_weight, --cross_attn_loss_weight, and --self_attn_loss_weight. By varying these values, you can fine-tune the attack for either more imperceptibility or increased transferability. Additionally, consider adjusting parameters such as --iterations, --diffusion_steps, and --start_step for further customization.
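Putting those flags together, a sweep over the loss weights might be invoked like this (the numeric values below are illustrative placeholders, not recommended or default settings):

```shell
# Illustrative only: all weight/step values are placeholders to show the flags.
python main.py --model_name resnet18 \
    --save_dir "$save_dir" --images_root "$image_root" --label_path "$label_path" \
    --res 224 \
    --attack_loss_weight 10 \
    --cross_attn_loss_weight 10000 \
    --self_attn_loss_weight 100 \
    --iterations 30 \
    --diffusion_steps 20 \
    --start_step 15
```

Raising `--attack_loss_weight` relative to the two attention-loss weights would be expected to trade imperceptibility for attack strength, per the guidance above.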

@youyuanyi

Thanks! Which changes did you make to solve the resolution issue?


@WindVChen
Owner

Hi @youyuanyi ,

You can refer to the commit here for the details of the changes.

Hope this can help.

@youyuanyi

Thank you very much!
