Zeroth-order-optimization-methods

This project was submitted as the final project for the course "Optimization for Data Science" of the Master's degree in Data Science at the University of Padova. It was realized by me, Rebecca Di Francesco, and my colleagues Brenda Eloísa Téllez Juárez and Abhishek Varma Dasaraju.

We followed the pseudocode of the algorithms described in the papers "ZO-AdaMM: Zeroth Order Adaptive Momentum" by Chen et al., "Stochastic first- and zeroth-order methods" by Ghadimi et al. and "SignSGD via zeroth-order oracle" by Liu et al. to implement three zeroth-order methods: ZO-SGD, ZO-SignSGD and ZO-AdaMM. These gradient-free methods approximate first-order gradients using only function evaluations, which makes them suitable for the black-box setting in which adversarial attacks are typically generated. We applied them to minimize the non-convex attack loss used to generate per-image adversarial examples against a CNN classifier trained on the CIFAR-10 dataset. Finally, we discussed the results by comparing the methods' performance in terms of distortion, attack loss and convergence over iterations.
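To make the setup concrete, below is a minimal sketch (not the project code) of the core building blocks: the two-point random gradient estimator that all three methods share, plus plain ZO-SGD, ZO-SignSGD and a simplified ZO-AdaMM update. All names and hyperparameters (`mu`, `n_queries`, `lr`, ...) are illustrative choices, and the ZO-AdaMM sketch omits the projection step of the paper; see the references above for the exact algorithms and constants.

```python
import numpy as np


def zo_gradient_estimate(f, x, mu=0.01, n_queries=10, rng=None):
    """Two-point random gradient estimator (zeroth-order oracle only).

    Approximates grad f(x) by averaging directional finite differences
    along random Gaussian directions u:
        g ~ (1/q) * sum_i [(f(x + mu*u_i) - f(x)) / mu] * u_i
    Scaling constants differ slightly between the referenced papers.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(x)
    fx = f(x)
    for _ in range(n_queries):
        u = rng.standard_normal(x.size)
        g += (f(x + mu * u) - fx) / mu * u
    return g / n_queries


def zo_sgd(f, x0, lr=0.05, n_iters=200, **est_kwargs):
    """ZO-SGD: gradient descent driven by the zeroth-order estimate."""
    x = x0.copy()
    for _ in range(n_iters):
        x -= lr * zo_gradient_estimate(f, x, **est_kwargs)
    return x


def zo_signsgd(f, x0, lr=0.01, n_iters=200, **est_kwargs):
    """ZO-SignSGD: step along the element-wise sign of the estimate."""
    x = x0.copy()
    for _ in range(n_iters):
        x -= lr * np.sign(zo_gradient_estimate(f, x, **est_kwargs))
    return x


def zo_adamm(f, x0, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8,
             n_iters=200, **est_kwargs):
    """Simplified ZO-AdaMM: adaptive-momentum update on the estimate.

    Projection onto the feasible set (used in the paper) is omitted here.
    """
    x = x0.copy()
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    v_hat = np.zeros_like(x)
    for _ in range(n_iters):
        g = zo_gradient_estimate(f, x, **est_kwargs)
        m = beta1 * m + (1 - beta1) * g            # first-moment estimate
        v = beta2 * v + (1 - beta2) * g ** 2       # second-moment estimate
        v_hat = np.maximum(v_hat, v)               # AMSGrad-style max
        x -= lr * m / (np.sqrt(v_hat) + eps)
    return x


if __name__ == "__main__":
    # Toy non-convex objective standing in for the black-box attack loss.
    f = lambda x: np.sum(x ** 2) + 0.5 * np.sum(np.sin(5 * x))
    x0 = np.ones(10)
    print("ZO-SGD     f(x*) =", f(zo_sgd(f, x0)))
    print("ZO-SignSGD f(x*) =", f(zo_signsgd(f, x0)))
    print("ZO-AdaMM   f(x*) =", f(zo_adamm(f, x0)))
```

In the actual experiments, the toy objective above is replaced by the per-image attack loss, evaluated only through queries to the trained CNN classifier.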
