
L2 Momentum Iterative Attack, risk of NA outputs #105

Open
qlero opened this issue Apr 7, 2022 · 1 comment


qlero commented Apr 7, 2022

Hello,

I'm reporting an issue I've encountered with the `MomentumIterativeAttack` object when using the `ord=2` argument. Sometimes the computation produces perturbed outputs (e.g. a batch of 64 MNIST images) that are a torch Tensor full of NaN values.

This is due to this step, lines 434-437 of `iterative_projected_gradient.py`:

```python
delta.data *= clamp(
    (self.eps * normalize_by_pnorm(delta.data, p=2) /
        delta.data),
    max=1.)
```

`delta.data` may sometimes contain 0-valued elements, which produce NaN through the division (0/0). Due to the iterative nature of the algorithm, these NaNs then propagate to the whole batch of images.

I suggest adding a small value to the divisor here, e.g. `self.eps * normalize_by_pnorm(delta.data, p=2) / (delta.data + 1e-8)`, to avoid this (akin to adding 1 to the vector `vv` in the function `batch_l1_proj_flat` in `utils.py`, line 239).
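The failure mode can be reproduced in isolation. The sketch below is a minimal illustration, not the advertorch code itself: it uses NumPy in place of torch, and a plain L2 normalization in place of `normalize_by_pnorm`, to show that a zero element in `delta` yields `0/0 = NaN`, while the proposed epsilon in the divisor keeps every element finite.

```python
import numpy as np

eps = 0.3
delta = np.array([0.0, 0.5, 1.0])  # one zero-valued element, as in the bug

# L2-normalized delta, standing in for normalize_by_pnorm(delta, p=2)
norm = np.linalg.norm(delta, ord=2)

# Current behavior: elementwise division by delta hits 0/0 at index 0
with np.errstate(divide="ignore", invalid="ignore"):
    factor = np.clip(eps * (delta / norm) / delta, a_min=None, a_max=1.0)
print(np.isnan(factor))  # the zero element of delta produced a NaN

# Proposed fix: a small constant in the divisor avoids the 0/0
factor_safe = np.clip(eps * (delta / norm) / (delta + 1e-8),
                      a_min=None, a_max=1.0)
print(np.isnan(factor_safe).any())  # no NaN remains
```

In the real attack loop the NaN would be multiplied back into `delta.data` and then into the next iteration's gradient step, which is why a single zero element corrupts the entire batch.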

Best regards,

qlero


futakw commented Aug 30, 2022

Hi,

I encountered the same error, and qlero's solution fixed it. I also suggest that this fix be added.
