Numerical optimization via mollifier smoothing - Python, updated Mar 7, 2024
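As a hedged sketch of the mollifier-smoothing idea named in the title (not code from the repository itself; the function name and all hyperparameters below are illustrative): a non-smooth objective is convolved with a Gaussian mollifier, and the gradient of the smoothed surrogate is estimated by Monte Carlo finite differences, so plain gradient descent can be applied without any analytic derivative.

```python
import numpy as np

def smoothed_descent(f, x0, sigma=0.5, lr=0.05, iters=300, samples=20, seed=0):
    """Gradient descent on a Gaussian-mollified surrogate of f.

    The gradient of E_u[f(x + sigma * u)], u ~ N(0, I), is estimated with a
    forward-difference Monte Carlo estimator. All hyperparameters here are
    illustrative defaults, not tuned values.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        u = rng.standard_normal((samples, x.size))
        fx = f(x)
        # Each sample contributes (f(x + sigma*u_i) - f(x)) / sigma * u_i,
        # an unbiased estimate of the smoothed objective's gradient.
        g = np.mean([(f(x + sigma * ui) - fx) / sigma * ui for ui in u], axis=0)
        x -= lr * g
    return x

# Usage: minimize a quadratic without calling any gradient.
x_star = smoothed_descent(lambda v: np.sum((v - 1.0) ** 2), np.zeros(3))
```

Because the estimator only ever evaluates `f`, the same loop works for objectives that are noisy, discontinuous, or otherwise non-differentiable.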
Snake SL - Supervised Learning that solves the Snake game, implemented with the Gradient-Free-Optimizers library for Python; the neural networks were built in Keras and the game in Pygame.
Fireworks swarm optimization - an efficient derivative-free solver.
Snake RL - Reinforcement Learning that solves the Snake game, implemented with the Gradient-Free-Optimizers library for Python; the neural networks were built in Keras and the game in Pygame.
ESKit - a portable library written in C that provides implementations of several self-adaptive evolution strategies.
Implementation code for the paper "Bayesian Optimization via Exact Penalty"
A minimal implementation of the random search algorithm for reinforcement learning.
Gradient-free online optimization loosely based on Adaptive Moment Estimation (Adam)
Particle Swarm Optimisation, Genetic Algorithm/Programming for (Gradient-Free) Neural Network Optimisation
PRIMA: Reference Implementation for Powell's methods with Modernization and Amelioration
0th order optimizers, gradient chaining, random gradient approximation
A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).
Implementation of smoothing-based optimization algorithms
A collection and visualization of single objective black-box functions for optimization benchmarking.
A Julia implementation of Simultaneous Perturbation Stochastic Approximation
Exploring evolutionary protein fitness landscapes
Black-box adversarial attacks on deep neural networks with tensor train (TT) decomposition and PROTES optimizer.
Sparse Perturbations for Improved Convergence in Stochastic Zeroth-Order Optimization
A pure-MATLAB library of EVolutionary (population-based) OPTimization for Large-Scale black-box continuous Optimization (evopt-lso).
Particle Swarm Optimiser
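Several entries above (the SPSA implementation, the sparse-perturbation work, the random-gradient-approximation optimizers) build on the same two-evaluation gradient estimator. As a hedged sketch of plain SPSA (function name and step-size constants are illustrative, not taken from any repository listed here): each iteration perturbs all coordinates at once with a random Rademacher vector, so the gradient estimate costs two function evaluations regardless of dimension.

```python
import numpy as np

def spsa_minimize(f, x0, iters=300, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """Minimize f via Simultaneous Perturbation Stochastic Approximation.

    The gain-sequence exponents alpha and gamma are the values commonly
    recommended in the SPSA literature; a and c are illustrative defaults.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(1, iters + 1):
        ak = a / k**alpha                              # decaying step size
        ck = c / k**gamma                              # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=x.shape)  # Rademacher perturbation
        # Two evaluations estimate every gradient component simultaneously.
        g_hat = (f(x + ck * delta) - f(x - ck * delta)) / (2.0 * ck) * (1.0 / delta)
        x -= ak * g_hat
    return x

# Usage: minimize a shifted quadratic with only black-box evaluations.
x_star = spsa_minimize(lambda v: np.sum((v - 3.0) ** 2), np.zeros(4))
```

Contrast with coordinate-wise finite differences, which would need 2n evaluations per step in n dimensions; the two-evaluation estimate is noisier, but the decaying gains average the noise out over iterations.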