# bandit-algorithms

Here are 82 public repositories matching this topic...

An open source multi-armed bandit framework for quickly optimizing your website. You'll quickly see the benefits of several simple algorithms, including epsilon-Greedy, Softmax, and Upper Confidence Bound (UCB), by working through this framework, which is written in Java and easy to adapt for deployment on your own website.

  • Updated Feb 17, 2018
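This entry names three classic bandit policies. For orientation, here is a minimal epsilon-Greedy sketch in Java (the framework's language). It is a hypothetical illustration, not code from the repository: with probability epsilon it explores a random arm, otherwise it exploits the arm with the highest estimated mean reward. Arm count and numeric rewards are assumed.

```java
import java.util.Random;

// Minimal epsilon-Greedy bandit sketch (hypothetical, not from the framework).
public class EpsilonGreedy {
    private final double epsilon;     // exploration rate, e.g. 0.1
    private final double[] estimates; // running mean reward per arm
    private final int[] counts;       // number of pulls per arm
    private final Random rng = new Random();

    public EpsilonGreedy(int numArms, double epsilon) {
        this.epsilon = epsilon;
        this.estimates = new double[numArms];
        this.counts = new int[numArms];
    }

    // Explore a random arm with probability epsilon, else pick the best estimate.
    public int selectArm() {
        if (rng.nextDouble() < epsilon) {
            return rng.nextInt(estimates.length);
        }
        int best = 0;
        for (int a = 1; a < estimates.length; a++) {
            if (estimates[a] > estimates[best]) best = a;
        }
        return best;
    }

    // Incrementally update the chosen arm's running mean reward.
    public void update(int arm, double reward) {
        counts[arm]++;
        estimates[arm] += (reward - estimates[arm]) / counts[arm];
    }
}
```

UCB differs only in selection: instead of random exploration, it deterministically picks the arm maximizing estimate + sqrt(2 ln t / count), where t is the total number of pulls, so rarely tried arms keep getting selected until their uncertainty bonus shrinks.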

This repository contains implementations of a wide variety of Reinforcement Learning projects across different applications of Bandit Algorithms, MDPs, Distributed RL, and Deep RL. These include university projects and projects implemented out of personal interest in Reinforcement Learning.

  • Updated Feb 18, 2023
  • Jupyter Notebook
