A pure-MATLAB library of EVolutionary (population-based) OPTimization for Large-Scale black-box continuous Optimization (evopt-lso).
Numerical optimization via mollifier smoothing
Snake SL - Supervised Learning that solves the Snake game. SL was implemented with the Gradient-Free-Optimizers library for Python, the neural networks were created in Keras, and the game was built with Pygame.
Gradient-free reinforcement learning solving the OpenAI Gym LunarLander-v2 environment with an evolution strategy (genetic algorithm).
A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).
Fireworks swarm optimization - an efficient derivative-free solver.
🥭 MANGO: Maximization of neural Activation via Non-Gradient Optimization
Implementation code for the paper "Bayesian Optimization via Exact Penalty"
Snake RL - Reinforcement Learning that solves the Snake game. RL was implemented with the Gradient-Free-Optimizers library for Python, the neural networks were created in Keras, and the game was built with Pygame.
ESKit is a portable library written in C that provides implementations of several self-adaptive evolution strategies.
A minimal implementation of the random search algorithm for reinforcement learning.
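Random search is the simplest gradient-free method in this list: propose a random perturbation, keep it only if it improves the objective. As a rough illustration (not the code of the listed repo; the function name and defaults below are my own), a minimal sketch in Python:

```python
import numpy as np

def random_search(f, x0, step=0.1, iters=1000, seed=0):
    """Minimize f by accepting only random perturbations that improve it."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        cand = x + step * rng.standard_normal(x.shape)  # random proposal
        fc = f(cand)
        if fc < fx:  # greedy acceptance: keep only improving moves
            x, fx = cand, fc
    return x, fx

# Usage: minimize the 2-D sphere function
x, fx = random_search(lambda v: float(np.sum(v ** 2)), [2.0, -3.0])
```

Because the method never evaluates gradients, it works on noisy or non-differentiable objectives, at the cost of many function evaluations.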
Gradient free reinforcement learning for PyTorch
A collection and visualization of single objective black-box functions for optimization benchmarking.
Gradient-free online optimization loosely based on Adaptive Moment Estimation (Adam)
Implementation of smoothing-based optimization algorithms
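Smoothing-based (mollifier) methods replace the objective f with a smoothed surrogate f_sigma(x) = E[f(x + sigma*u)], u ~ N(0, I), whose gradient can be estimated from function values alone. A minimal sketch of this idea using an antithetic Monte-Carlo estimator (all names and parameter choices below are illustrative, not taken from the listed repos):

```python
import numpy as np

def smoothed_grad(f, x, sigma=0.1, samples=32, rng=None):
    """Antithetic Monte-Carlo estimate of grad f_sigma(x),
    where f_sigma(x) = E[f(x + sigma*u)], u ~ N(0, I)."""
    rng = rng or np.random.default_rng(0)
    u = rng.standard_normal((samples, x.size))
    # Paired +/- evaluations cancel the constant term and reduce variance.
    diff = np.array([f(x + sigma * ui) - f(x - sigma * ui) for ui in u])
    return (diff[:, None] * u).mean(axis=0) / (2 * sigma)

def smoothed_descent(f, x0, lr=0.1, steps=200, sigma=0.1, seed=0):
    """Gradient descent on the smoothed surrogate of f."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * smoothed_grad(f, x, sigma=sigma, rng=rng)
    return x

# Usage: descend the 2-D sphere function without analytic gradients
x = smoothed_descent(lambda v: float(np.sum(v ** 2)), [2.0, -3.0])
```

The same estimator underlies evolution-strategy-style optimizers: sigma trades off smoothing strength against bias relative to the original objective.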
A Julia implementation of Simultaneous Perturbation Stochastic Approximation
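The listed repo implements SPSA in Julia; to show the algorithm's core idea in the language used by most entries here, a minimal Python sketch (names and gain-schedule constants below are my own, following the commonly used defaults):

```python
import numpy as np

def spsa(f, x0, a=0.1, c=0.1, alpha=0.602, gamma=0.101, iters=500, seed=0):
    """Simultaneous Perturbation Stochastic Approximation:
    estimates the full gradient from only two function evaluations
    per iteration, using a random Rademacher perturbation."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        ak = a / (k + 1) ** alpha   # decaying step size
        ck = c / (k + 1) ** gamma   # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=x.shape)  # +/-1 directions
        # One simultaneous perturbation estimates every gradient component.
        ghat = (f(x + ck * delta) - f(x - ck * delta)) / (2 * ck * delta)
        x = x - ak * ghat
    return x

# Usage: minimize the 2-D sphere function
x = spsa(lambda v: float(np.sum(v ** 2)), np.array([1.0, -2.0]))
```

The two-evaluation cost per step, independent of dimension, is what makes SPSA attractive for high-dimensional black-box problems.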
Particle Swarm Optimiser
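Particle swarm optimization maintains a population of candidate solutions that are pulled toward each particle's own best position and the swarm's global best. A compact sketch of the standard update rule (hypothetical helper, not the listed repo's code):

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer with inertia weight w and
    cognitive/social coefficients c1, c2."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))   # positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()                                  # per-particle bests
    pbest_f = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()              # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())

# Usage: minimize the 2-D sphere function on [-5, 5]^2
g, fg = pso(lambda v: float(np.sum(v ** 2)), ([-5, -5], [5, 5]))
```

Like the other methods on this page, PSO needs only objective values, so it applies directly to black-box problems.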
Exploring evolutionary protein fitness landscapes
Zeroth-order Frank-Wolfe algorithm. Project for the Optimization for Data Science exam.
Black-box adversarial attacks on deep neural networks with tensor train (TT) decomposition and PROTES optimizer.