This repository demonstrates the use of reinforcement learning, specifically Q-Learning, REINFORCE, and Actor-Critic (A2C) methods, to play CartPole-v0 from OpenAI Gym.
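Several of the repositories below apply tabular Q-Learning to CartPole by discretizing the state. The core of that method is a single temporal-difference update; a minimal sketch (function and parameter names are illustrative, not taken from any listed repository):

```python
import numpy as np

def q_learning_update(Q, state, action, reward, next_state,
                      alpha=0.1, gamma=0.99):
    """One tabular Q-learning step:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    td_target = reward + gamma * np.max(Q[next_state])
    td_error = td_target - Q[state, action]
    Q[state, action] += alpha * td_error
    return td_error

# Toy table: 4 discretized states, 2 actions (e.g. push cart left/right).
Q = np.zeros((4, 2))
err = q_learning_update(Q, state=0, action=1, reward=1.0, next_state=2)
```

In a real CartPole agent the same update runs inside the environment loop, with the continuous observation binned into a discrete state index first.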
Reinforcement learning framework for implementing custom models on custom environments using state-of-the-art RL algorithms.
Implementations of different RL algorithms.
OpenAI Gym game Lunar Lander tested with Deep Q Learning, Policy Gradient and Actor Critic algorithms.
This library is a plug-and-play version of our novel pipeline, VacSIM, which produces reinforcement-learning-guided policies for optimal distribution of COVID-19 vaccines.
With underflow, create traffic light clusters that interact to regulate traffic flow.
Board games with Reinforcement Learning. Peg Solitaire with an Actor Critic agent. NIM, Ledge (aka Gold Rush), and Hex with a Monte Carlo Tree Search agent.
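The board-game repository above relies on Monte Carlo Tree Search, whose selection phase typically uses the UCT rule: pick the child maximizing mean value plus an exploration bonus that shrinks with visit count. A hedged sketch (the function and parameter names are assumptions for illustration):

```python
import math

def uct_score(value_sum, visits, parent_visits, c=1.41):
    """UCT: exploitation (mean value) + exploration (visit-count bonus).
    Unvisited children are selected first (infinite score)."""
    if visits == 0:
        return float("inf")
    exploit = value_sum / visits
    explore = c * math.sqrt(math.log(parent_visits) / visits)
    return exploit + explore

# With equal visit counts, the child with the higher mean value wins;
# an unvisited child always outranks both.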
Reinforcement learning agents implemented with PyTorch
Implementations of reinforcement learning algorithms.
A Python-based repository with implementations of RL algorithms, featuring visualization tools and benchmarks
Several RL-agents are tested on classical environments and benchmarked against their stable-baselines implementation.
Compares the efficiency and effectiveness of a simple Dense layer versus an LSTM layer on CartPole.
Reinforcement learning.
This repository showcases an implementation of PPO-Clip, a first-order method, to solve the discrete LunarLander environment.
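PPO-Clip's defining piece is its clipped surrogate objective: the probability ratio between new and old policies is clipped to [1 - eps, 1 + eps] so a single update cannot move the policy too far. A minimal sketch in NumPy (names are illustrative, not from the repository above):

```python
import numpy as np

def ppo_clip_loss(logp_new, logp_old, advantages, eps=0.2):
    """Negative clipped surrogate: -E[min(r*A, clip(r, 1-eps, 1+eps)*A)],
    where r = pi_new(a|s) / pi_old(a|s)."""
    ratio = np.exp(logp_new - logp_old)
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps)
    return -np.mean(np.minimum(ratio * advantages, clipped * advantages))

adv = np.array([2.0, 1.0])
logp = np.log(np.array([0.5, 0.5]))
loss_same = ppo_clip_loss(logp, logp, adv)        # no policy change: ratio = 1
loss_big = ppo_clip_loss(logp + 1.0, logp, adv)   # large change: ratio clipped at 1.2
```

Because the ratio is clipped at 1 + eps, the second loss gains nothing beyond the clip boundary, which is the mechanism that keeps PPO updates conservative.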
Actor-Critic methods explored using PyTorch and OpenAI Gym on the CartPole and Acrobot environments.
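The actor-critic repositories in this list generally weight the policy-gradient update by a one-step advantage estimate, A = r + gamma * V(s') - V(s), computed from the critic's value predictions. A small illustrative sketch (function name and defaults are assumptions):

```python
def td_advantage(reward, value_s, value_next, gamma=0.99, done=False):
    """One-step advantage for actor-critic: A = r + gamma * V(s') - V(s).
    On terminal steps the bootstrap term is dropped."""
    bootstrap = 0.0 if done else gamma * value_next
    return reward + bootstrap - value_s

a_mid = td_advantage(1.0, value_s=0.5, value_next=1.0)            # bootstraps from V(s')
a_end = td_advantage(1.0, value_s=0.5, value_next=1.0, done=True)  # terminal: no bootstrap
```

The actor then ascends log pi(a|s) * A while the critic regresses V(s) toward r + gamma * V(s'); A2C batches this update across parallel environments.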
Inverse Reinforcement Learning for Robot Hand Manipulation Task
Implementing some RL algorithms (using PyTorch) on the CartPole environment by OpenAI.