Simple reinforcement learning tutorials; Chinese AI teaching from 莫烦Python
Chinese reinforcement learning tutorial (the "Mushroom Book" 🍄); read online at https://datawhalechina.github.io/easy-rl/
Contains high-quality implementations of deep reinforcement learning algorithms written in PyTorch
Master Reinforcement and Deep Reinforcement Learning using OpenAI Gym and TensorFlow
Reinforcement learning tutorials
🐋 Simple implementations of various popular Deep Reinforcement Learning algorithms using TensorFlow2
This repository contains PyTorch implementations of classic deep reinforcement learning algorithms, including DQN, DDQN, Dueling Network, DDPG, SAC, A2C, PPO, and TRPO (more algorithms are in progress)
Clean, Robust, and Unified PyTorch implementation of popular DRL Algorithms (Q-learning, Duel DDQN, PER, C51, Noisy DQN, PPO, DDPG, TD3, SAC, ASL)
Repository for the code accompanying 'Deep Reinforcement Learning'
Reinforcement learning | TensorFlow implementations of DQN, Dueling DQN, and Double DQN evaluated on Atari Breakout
PyTorch implementations of various DQN variants
Basic reinforcement learning algorithms, including: DQN, Double DQN, Dueling DQN, SARSA, REINFORCE, REINFORCE with baseline, Actor-Critic, DDPG, DDPG for discrete action spaces, A2C, A3C, TD3, SAC, TRPO
DQN, DDDQN, A3C, PPO, Curiosity applied to the game DOOM
🍄Reinforcement Learning: Super Mario Bros with dueling dqn🍄
An implementation of Deep Q-Learning (DQN) playing Breakout from OpenAI Gym, written in Keras.
Paddle-RLBooks is a reinforcement learning code study guide based on pure PaddlePaddle.
An implementation of (Double/Dueling) Deep-Q Learning to play Super Mario Bros.
Pytorch implementation of distributed deep reinforcement learning
OpenAI LunarLander-v2 DeepRL-based solutions (DQN, DuelingDQN, D3QN)
Deep Learning Project
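The common thread across these repositories is the dueling architecture (Wang et al., 2016), which splits the Q-network into a state-value stream V(s) and an advantage stream A(s, a), then recombines them with a mean-subtraction term for identifiability. A minimal NumPy sketch of that aggregation step (the function name and inputs here are illustrative, not taken from any listed repository):

```python
import numpy as np

def dueling_q_values(value, advantages):
    """Combine a scalar state value V(s) and per-action advantages A(s, a)
    into Q-values using the dueling aggregation:
        Q(s, a) = V(s) + A(s, a) - mean_a' A(s, a')
    Subtracting the mean advantage keeps V and A identifiable."""
    advantages = np.asarray(advantages, dtype=float)
    return value + advantages - advantages.mean()

# Example: V(s) = 1.0, advantages for three actions.
q = dueling_q_values(1.0, [2.0, 0.0, 1.0])  # → [2.0, 0.0, 1.0]
best_action = int(np.argmax(q))             # greedy action: index 0
```

In a full PyTorch or TensorFlow implementation, `value` and `advantages` would be the outputs of two separate heads sharing a common feature extractor, and this aggregation would form the network's final layer.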