babaniyi/bandits

Multi-armed bandits

I implement several multi-armed bandit algorithms, including ε-greedy, annealing ε-greedy, Thompson sampling, and UCB, and compare their performance and how quickly each one finds the best arm.
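To illustrate the first of these, here is a minimal, stdlib-only sketch of ε-greedy on Bernoulli arms (the function name, arm means, and parameters are illustrative, not taken from this repository): with probability ε the agent explores a random arm, otherwise it exploits the arm with the highest estimated mean.

```python
import random

def epsilon_greedy(true_means, epsilon=0.1, n_steps=5000, seed=0):
    """Simulate epsilon-greedy on Bernoulli arms.

    true_means: success probability of each arm (hypothetical values).
    Returns pull counts and estimated mean reward per arm.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms      # number of pulls per arm
    values = [0.0] * n_arms    # running mean reward per arm
    for _ in range(n_steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                        # explore
        else:
            arm = max(range(n_arms), key=lambda a: values[a])  # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental mean update: new_mean = old_mean + (r - old_mean) / n
        values[arm] += (reward - values[arm]) / counts[arm]
    return counts, values

counts, values = epsilon_greedy([0.2, 0.5, 0.8])
```

With a fixed ε, the agent keeps exploring forever at the same rate; the annealing variant decays ε over time so late pulls concentrate on the best arm.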

About

Algorithms for multi-armed bandits, such as ε-greedy, Thompson sampling, etc.
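Thompson sampling, the Bayesian approach mentioned above, can be sketched as follows for Bernoulli arms (again a stdlib-only illustration with hypothetical names and parameters, not the repository's own code): each arm keeps a Beta posterior over its success probability, and at every step the agent samples one mean from each posterior and pulls the argmax.

```python
import random

def thompson_sampling(true_means, n_steps=5000, seed=0):
    """Thompson sampling for Bernoulli arms with Beta(1, 1) priors.

    true_means: success probability of each arm (hypothetical values).
    Returns pull counts plus the Beta(alpha, beta) posterior parameters.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    alphas = [1] * n_arms   # successes + 1 (uniform prior)
    betas = [1] * n_arms    # failures + 1 (uniform prior)
    counts = [0] * n_arms
    for _ in range(n_steps):
        # sample a plausible mean for each arm from its posterior
        samples = [rng.betavariate(alphas[a], betas[a]) for a in range(n_arms)]
        arm = samples.index(max(samples))
        reward = 1 if rng.random() < true_means[arm] else 0
        alphas[arm] += reward
        betas[arm] += 1 - reward
        counts[arm] += 1
    return counts, alphas, betas

counts, alphas, betas = thompson_sampling([0.2, 0.5, 0.8])
```

Because uncertain arms occasionally produce high posterior samples, exploration falls out of the sampling itself; no explicit ε schedule is needed.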
