
Bug in TF2 PGD implementation #1131

Open
meta-inf opened this issue Nov 13, 2019 · 1 comment
meta-inf commented Nov 13, 2019

Describe the bug
In future.tf2.attacks.projected_gradient_descent, the sanity check clip_min <= x <= clip_max is implemented incorrectly: the code appends a boolean tensor to asserts, and the subsequent np.all call on asserts invokes the tensor's __bool__ cast, which raises the "The truth value of an array with more than one element is ambiguous" error.
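
The failure mode can be sketched with NumPy alone (a stand-in: in the attack the list elements are eager tf.Tensor objects, which raise the same error when cast to a truth value):

```python
import numpy as np

# Stand-in for the eager boolean tensor that an elementwise clip check
# such as tf.math.greater_equal(x, clip_min) produces in the TF2 attack.
check = np.greater_equal([[1., 2., 3.], [4., 5., 6.]], 0.)  # shape (2, 3)

try:
    # Casting a multi-element boolean array/tensor to bool is ambiguous;
    # this is what np.all ends up doing on an object array of mixed-shape
    # sanity checks.
    bool(check)
except ValueError as e:
    print("ValueError:", e)
```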

To Reproduce

Use the following code:

import tensorflow as tf
import numpy as np
from cleverhans.future.tf2.attacks import projected_gradient_descent

def model_fn(x):
  return tf.nn.softmax(x, axis=-1)

X = [[1., 2., 3.], [4., 5., 6.]]
print(projected_gradient_descent(
  model_fn, X, eps=1., eps_iter=1., nb_iter=5,
  norm=np.inf, clip_min=0., clip_max=7.))

Expected behavior
Code shouldn't crash.

System configuration

  • Ubuntu 18.04.3
  • Python 3.6.0
  • TensorFlow 2.0.0
CNOCycle (Contributor) commented
I have implemented a TF2 version of the PGD attack in my fork. This issue should be fixed there:

https://github.com/CNOCycle/cleverhans/tree/feature/tf2.x
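
For reference, one way to make such sanity checks safe to aggregate (a sketch with a hypothetical helper name, not necessarily how the linked fork fixes it) is to reduce each elementwise check to a scalar before collecting it, so that np.all over the list never needs to cast a multi-element tensor to bool:

```python
import numpy as np

def clip_sanity_checks(x, clip_min, clip_max):
    # Hypothetical helper, not part of cleverhans: reduce each
    # elementwise check to a plain Python bool so that np.all() over
    # the collected list is unambiguous.
    asserts = []
    asserts.append(bool(np.all(np.greater_equal(x, clip_min))))
    asserts.append(bool(np.all(np.less_equal(x, clip_max))))
    return asserts

x = np.array([[1., 2., 3.], [4., 5., 6.]])
assert np.all(clip_sanity_checks(x, 0., 7.))      # x lies inside [0, 7]
assert not np.all(clip_sanity_checks(x, 0., 5.))  # 6. exceeds clip_max=5.
```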
