PyPop7: A Pure-Python Library for POPulation-based Black-Box Optimization (BBO), especially their *Large-Scale* versions/variants. https://pypop.rtfd.io/ (Python; last updated May 13, 2024)
Square Attack: a query-efficient black-box adversarial attack via random search [ECCV 2020]
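(For context on "random search" in this black-box setting: the attacker proposes random perturbations and keeps only those that improve the objective, using nothing but queries to the target function. A minimal, generic sketch — names and parameters are illustrative and unrelated to the repository's actual code:)

```python
import numpy as np

def random_search(f, x0, step=0.1, iters=500, rng=None):
    """Minimize f by accepting random perturbations that improve it.

    Only function evaluations (queries) are used -- no gradients.
    """
    rng = np.random.default_rng(rng)
    x, fx = x0.copy(), f(x0)
    for _ in range(iters):
        cand = x + step * rng.standard_normal(x.shape)  # random proposal
        fc = f(cand)
        if fc < fx:  # greedy acceptance: keep only improving moves
            x, fx = cand, fc
    return x, fx

# Usage: drive a simple quadratic toward its minimum by queries alone.
x, fx = random_search(lambda v: np.sum(v ** 2), np.array([3.0, -4.0]), rng=0)
```

Square Attack refines this basic loop with structured (square-shaped) perturbations and a shrinking step schedule, which is what makes it query-efficient in practice.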
Elo ratings for global black box derivative-free optimizers
Powell's Derivative-Free Optimization solvers.
Official implementation for the paper "CoVO-MPC: Theoretical Analysis of Sampling-based MPC and Optimal Covariance Design" accepted by L4DC 2024. CoVO-MPC is an optimal sampling-based MPC algorithm.
This repository contains the PyTorch implementation of Zeroth Order Optimization Based Adversarial Black Box Attack (https://arxiv.org/abs/1708.03999)
Official code for the paper "Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark".
[ICLR'24] "DeepZero: Scaling up Zeroth-Order Optimization for Deep Model Training" by Aochuan Chen*, Yimeng Zhang*, Jinghan Jia, James Diffenderfer, Jiancheng Liu, Konstantinos Parasyris, Yihua Zhang, Zheng Zhang, Bhavya Kailkhura, Sijia Liu
Robustify Black-Box Models (ICLR'22 - Spotlight)
Code for IEEE MLSP 2021 paper titled "Model-Free Learning of Optimal Deterministic Resource Allocations in Wireless Systems via Action-Space Exploration"
Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
SCOBO: Sparsity-aware Comparison Oracle Based Optimization
Benchmarking optimization solvers.
Implementation of the algorithms described in the papers "ZO-AdaMM: Zeroth Order Adaptive Momentum" by Chen et al., "Stochastic first- and zeroth-order methods" by Ghadimi et al., and "SignSGD via zeroth-order oracle" by Liu et al.
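(Most of the zeroth-order methods listed here build on the same primitive: estimating a gradient from function evaluations alone. A minimal, generic sketch of the standard two-point Gaussian estimator — function and parameter names are illustrative, not taken from any of these repositories:)

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, num_samples=50, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Averages (f(x + mu*u) - f(x)) / mu * u over random Gaussian
    directions u; requires only function evaluations, no derivatives.
    """
    rng = np.random.default_rng(rng)
    fx = f(x)
    grad = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.shape)
        grad += (f(x + mu * u) - fx) / mu * u
    return grad / num_samples

# Usage: plain gradient descent on a quadratic using only function values.
f = lambda v: np.sum(v ** 2)
rng = np.random.default_rng(0)
x = np.array([1.0, -2.0])
for _ in range(200):
    x -= 0.05 * zo_gradient(f, x, rng=rng)
```

Methods such as ZO-AdaMM or ZO-SignSGD swap the plain descent step above for an adaptive-momentum or sign-based update while keeping the same query-only gradient estimate.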
Nevergrad Optimizer Benchmarking for 3D Performance Capture
Hard-Thresholding Meets Evolution Strategies in Reinforcement Learning
A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).
Blockwise Direct Search
Sparse Perturbations for Improved Convergence in Stochastic Zeroth-Order Optimization
[NeurIPS 2023] “SODA: Robust Training of Test-Time Data Adaptors”