A minimal implementation of the random search algorithm for reinforcement learning.
A pure-MATLAB library of EVolutionary (population-based) OPTimization for Large-Scale black-box continuous Optimization (evopt-lso).
Gradient-free reinforcement learning for PyTorch
A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).
Deep Neural Network Optimization Platform with Gradient-based, Gradient-Free Algorithms
A Julia implementation of Simultaneous Perturbation Stochastic Approximation
Exploring evolutionary protein fitness landscapes
Particle Swarm Optimisation, Genetic Algorithm/Programming for (Gradient-Free) Neural Network Optimisation
Gradient-free online optimization loosely based on Adaptive Moment Estimation (Adam)
Sparse Perturbations for Improved Convergence in Stochastic Zeroth-Order Optimization
Zeroth-order Frank-Wolfe algorithm. Project for the Optimization for Data Science exam.
Markov Chain Monte Carlo binary network optimization
Fireworks swarm optimization - an efficient derivative-free solver.
Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
Particle Swarm Optimiser
Snake RL - Reinforcement Learning that solves the Snake game. The RL agent was implemented with the Gradient-Free-Optimizers library for Python, the neural network was created in Keras, and the game was built in Pygame.
Snake SL - Supervised Learning that solves the Snake game. The SL model was implemented with the Gradient-Free-Optimizers library for Python, the neural network was created in Keras, and the game was built in Pygame.
ACL'2023: Multi-Task Pre-Training of Modular Prompt for Few-Shot Learning
Gradient-free Reinforcement Learning solving the OpenAI Gym LunarLander-v2 environment by Evolution Strategy (Genetic Algorithm)
ESKit is a portable library written in C that provides implementations of some self-adaptive evolution strategies.
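Several of the repositories above implement random-search-style gradient-free optimization. As a hedged illustration of the general idea (not the code of any particular project listed here), a minimal hill-climbing random search over a black-box objective might look like this; the `reward` function and all parameter values are illustrative assumptions:

```python
import random

def random_search(reward_fn, dim, iters=2000, scale=0.1, seed=0):
    """Minimal random search: perturb the best-known parameter vector
    with Gaussian noise and keep any candidate that improves the
    black-box reward. No gradients are used."""
    rng = random.Random(seed)
    best = [0.0] * dim
    best_reward = reward_fn(best)
    for _ in range(iters):
        # Propose a nearby candidate by jittering every coordinate.
        candidate = [w + rng.gauss(0.0, scale) for w in best]
        r = reward_fn(candidate)
        if r > best_reward:  # greedy acceptance of improvements
            best, best_reward = candidate, r
    return best, best_reward

# Toy black-box reward, maximized at w = (1, -2); stands in for an
# episode return in the reinforcement-learning setting.
reward = lambda w: -((w[0] - 1.0) ** 2 + (w[1] + 2.0) ** 2)
w, r = random_search(reward, dim=2)
```

In the reinforcement-learning variants listed above, `reward_fn` would evaluate a policy parameterized by `w` over one or more environment episodes; the search loop itself is unchanged.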