Implementing Deep Reinforcement Learning Algorithms
Updated Nov 15, 2020 - Jupyter Notebook
The repository contains code for reinforcement learning algorithms (e.g., Q-Learning, Monte Carlo, …) in the form of Python files.
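Since several of the repositories below implement tabular Q-Learning, a minimal sketch may help orient readers. This is a generic illustration on a toy chain environment (the environment, hyperparameters, and function name are invented for the example, not taken from any listed repo):

```python
import random

def q_learning_chain(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a toy chain MDP: actions are left/right,
    and reward 1.0 is given only on reaching the rightmost state."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n_states)]  # Q[state][action]; 0=left, 1=right
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: Q[s][x])
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # standard temporal-difference (TD) update
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

Q = q_learning_chain()
```

After training, the greedy policy at every non-terminal state should prefer moving right, since the discount factor makes the shortest path to the goal the highest-value one.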
Reinforcement learning
A classic reinforcement learning problem.
This repository contains my assignment solutions for the Statistical Techniques for Data Science course at Innopolis University.
Multi-armed bandit, gambler's problem, cliff-walking problem, and TD learning
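The multi-armed bandit problem recurs throughout this list; a minimal epsilon-greedy solver illustrates the core explore/exploit loop. The function name, hyperparameters, and Bernoulli reward model here are illustrative assumptions, not code from any listed repository:

```python
import random

def epsilon_greedy_bandit(true_means, steps=10000, epsilon=0.1, seed=0):
    """Epsilon-greedy on a Bernoulli bandit: with probability epsilon pull a
    random arm, otherwise pull the arm with the highest estimated mean."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k          # pulls per arm
    estimates = [0.0] * k     # sample-average reward estimates
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                               # explore
        else:
            arm = max(range(k), key=lambda a: estimates[a])      # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # incremental mean
    return estimates, counts

est, cnt = epsilon_greedy_bandit([0.2, 0.5, 0.8])
```

With enough steps, the best arm (mean 0.8) dominates the pull counts while each other arm still receives roughly `epsilon / k` of the traffic.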
Final-project repository on Multi-Armed Bandits
Compared non-stationary multi-armed bandits in single-agent and multi-agent scenarios; Distributed Optimization and Learning (DOL) course project
Experiments for paper "Online Learning with Costly Features in Non-stationary Environments"
A short conceptual replication of "Prefrontal cortex as a meta-reinforcement learning system" in Jax.
Contextual Multi-Armed Bandit Platform for Scoring, Ranking & Decisions
A multi-armed bandit optimization implementation in Python, using reinforcement learning to select notifications.
An implementation of solvers for the multi-armed bandit problem in JavaScript.
This repository is based on the lecture "Building Recommender Systems Using Customer Data and Deep Learning" (고객데이터와 딥러닝을 활용한 추천시스템 구현).
This repository contains an end-to-end, real-time 🕰️ machine learning pipeline to predict the star ⭐️ rating of product reviews. The project uses AWS SageMaker, Kinesis, Lambda, S3, Redshift, Athena, and Step Functions. Deployment of multiple models for A/B testing and bandit testing is also included.
The GitHub repository for "Accelerating Approximate Thompson Sampling with Underdamped Langevin Monte Carlo", AISTATS 2024.
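The paper above accelerates *approximate* Thompson sampling with underdamped Langevin Monte Carlo; for contrast, exact Thompson sampling is tractable in the Beta-Bernoulli case, where the posterior has a closed form. This is a generic textbook sketch (the function name and settings are illustrative, not from that repository):

```python
import random

def thompson_bernoulli(true_means, steps=5000, seed=0):
    """Exact Beta-Bernoulli Thompson sampling: draw one sample from each
    arm's Beta posterior, pull the argmax, then update that posterior."""
    rng = random.Random(seed)
    k = len(true_means)
    alpha = [1.0] * k  # Beta(1, 1) uniform prior; alpha counts successes + 1
    beta = [1.0] * k   # beta counts failures + 1
    pulls = [0] * k
    for _ in range(steps):
        samples = [rng.betavariate(alpha[a], beta[a]) for a in range(k)]
        arm = max(range(k), key=lambda a: samples[a])
        reward = 1 if rng.random() < true_means[arm] else 0
        alpha[arm] += reward
        beta[arm] += 1 - reward
        pulls[arm] += 1
    return pulls

pulls = thompson_bernoulli([0.3, 0.6])
```

When the posterior lacks a closed form (e.g., logistic or neural reward models), the sampling step becomes expensive, which is exactly the setting where Langevin-based approximations are used instead.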
A curated list of resources about multi-armed bandit (MAB).
My silver medal winning entry in Kaggle's 2020 Christmas Competition
This is a project to build a multi armed bandit from scratch based on the Kaggle Christmas 2020 Competition.