First public version
I finally open-sourced my research framework on multi-armed bandits 🎉
https://github.com/Naereen/AlgoBandits
Please keep in mind that this is only meant as a research framework:
easy to interact with, easy to modify, and easy to use for running small
or medium-sized simulations and producing nice figures for research papers.
It is not meant as an industry-grade package for multi-armed bandits. If
you want to use MAB algorithms for real-world content optimization, you
should implement them yourself so they better suit your needs.
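To give an idea of how little code a from-scratch implementation requires, here is a minimal sketch of the classic UCB1 index policy on Bernoulli arms. This is just an illustration, not code from the framework; the function name, the two arm means, and the horizon are all made up for the demo.

```python
import math
import random

def ucb1(pulls, rewards, t):
    """Pick an arm by the UCB1 index (hypothetical helper, not from AlgoBandits)."""
    for a, n in enumerate(pulls):
        if n == 0:
            return a  # play every arm once before using the index
    return max(
        range(len(pulls)),
        key=lambda a: rewards[a] / pulls[a] + math.sqrt(2 * math.log(t) / pulls[a]),
    )

# Tiny simulation on two Bernoulli arms (means 0.3 and 0.7, chosen arbitrarily)
random.seed(0)
means = [0.3, 0.7]
pulls, rewards = [0, 0], [0.0, 0.0]
for t in range(1, 1001):
    arm = ucb1(pulls, rewards, t)
    pulls[arm] += 1
    rewards[arm] += 1.0 if random.random() < means[arm] else 0.0
```

After enough rounds the better arm (mean 0.7) ends up pulled far more often, which is exactly the exploration/exploitation trade-off these algorithms formalize.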
With that being said, I am very excited to finally share this on GitHub.
If you have any suggestions on how I could improve this project, I would
be delighted to hear them! Contributions such as issues, pull requests,
and questions are all welcome.