Implementing Deep Reinforcement Learning Algorithms
Updated Nov 15, 2020 - Jupyter Notebook
The repository contains code for RL algorithms (e.g., Q-Learning, Monte Carlo, …) as standalone Python files.
Reinforcement learning
Final-project repository on the Multi-Armed Bandit.
This repository contains an end-to-end, real-time 🕰️ machine learning pipeline to predict the star ⭐️ rating of product reviews. The project uses AWS SageMaker, Kinesis, Lambda, S3, Redshift, Athena, and Step Functions. Deployment of multiple models for A/B testing and bandit testing is also included.
The multi-armed bandit problem is one of the classical reinforcement learning problems; it describes the tension between an agent's exploration and its exploitation.
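That exploration–exploitation tension is commonly illustrated with an epsilon-greedy policy: with probability epsilon the agent tries a random arm, otherwise it pulls the arm with the best empirical mean. A minimal sketch, assuming Bernoulli arms with made-up reward probabilities (not taken from any repository listed here):

```python
import random

def epsilon_greedy(true_probs, steps=10_000, epsilon=0.1, seed=0):
    """Run epsilon-greedy on a Bernoulli bandit with the given (assumed) arm probabilities."""
    rng = random.Random(seed)
    n_arms = len(true_probs)
    counts = [0] * n_arms    # number of pulls per arm
    values = [0.0] * n_arms  # running mean reward per arm
    total_reward = 0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)  # explore: random arm
        else:
            arm = max(range(n_arms), key=lambda a: values[a])  # exploit: best estimate
        reward = 1 if rng.random() < true_probs[arm] else 0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean update
        total_reward += reward
    return values, counts, total_reward

values, counts, total = epsilon_greedy([0.2, 0.5, 0.8])
```

With enough steps, the pull counts concentrate on the best arm while the epsilon fraction of random pulls keeps the estimates for the other arms from going stale.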
My silver medal winning entry in Kaggle's 2020 Christmas Competition
This project builds a multi-armed bandit from scratch, based on Kaggle's Christmas 2020 competition.
Reinforcement learning techniques applied to solve pricing problems in e-commerce applications. Final project for "Online learning applications" course (2021-2022)
🦾🤖 Visual and interactive simulator of multi-armed bandit problem.
Compared non-stationary multi-armed bandits in single-agent and multi-agent scenarios. Distributed Optimization and Learning (DOL) course project.
A multi-armed bandit optimization implementation, using reinforcement learning in Python, for selecting notifications.
In probability theory, the multi-armed bandit problem is a problem in which a fixed, limited set of resources must be allocated among competing (alternative) choices in a way that maximizes the expected gain, when each choice's properties are only partially known at the time of allocation and may become better understood as time passes or by allocating resources to the choice.
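One standard way to handle the partial knowledge described above is the UCB1 index policy, which pulls the arm maximizing the empirical mean plus an exploration bonus that shrinks as an arm is sampled more. A minimal sketch, again with assumed Bernoulli reward probabilities chosen purely for illustration:

```python
import math
import random

def ucb1(true_probs, steps=5_000, seed=1):
    """UCB1 on a Bernoulli bandit: pick argmax of mean + sqrt(2 ln t / n_pulls)."""
    rng = random.Random(seed)
    n = len(true_probs)
    counts = [0] * n
    means = [0.0] * n
    for t in range(1, steps + 1):
        if t <= n:
            arm = t - 1  # pull each arm once to initialize its estimate
        else:
            arm = max(
                range(n),
                key=lambda a: means[a] + math.sqrt(2 * math.log(t) / counts[a]),
            )
        reward = 1 if rng.random() < true_probs[arm] else 0
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # incremental mean update
    return means, counts

means, counts = ucb1([0.3, 0.6, 0.9])
```

The bonus term grows with total time t but shrinks with an arm's pull count, so under-explored arms are periodically revisited even as the allocation concentrates on the apparent best choice.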
Prof. Jungmin So - spring '23
Experiments for paper "Bayesian Linear Bandits for Large-Scale Recommender Systems"
A simple multi-armed bandit for Go.
Modeling a 1-armed bandit with pystan.