Profiling Vehicles for Improved Small Cell Beam-Vehicle Pairing Using Multi-Armed Bandit
Library for multi-armed bandit selection strategies, including efficient deterministic implementations of Thompson sampling and epsilon-greedy.
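For reference, here is a minimal epsilon-greedy sketch in the spirit of such a library (illustrative only, not this library's actual API; the class and method names are assumptions):

```python
import random

class EpsilonGreedy:
    """Epsilon-greedy bandit: explore a random arm with probability
    epsilon, otherwise exploit the arm with the best empirical mean."""

    def __init__(self, n_arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # pulls per arm
        self.values = [0.0] * n_arms  # empirical mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))  # explore
        return max(range(len(self.values)), key=self.values.__getitem__)  # exploit

    def update(self, arm, reward):
        self.counts[arm] += 1
        # incremental update of the running mean
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]
```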
Interactive Recommender Systems Framework
Using Exp3 and its variations to select coins to invest in.
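For context, a minimal sketch of the standard Exp3 algorithm (rewards assumed in [0, 1]; names are illustrative, not taken from this project):

```python
import math
import random

class Exp3:
    """Exp3 for adversarial bandits: keeps a weight per arm and mixes
    the weight distribution with uniform exploration."""

    def __init__(self, n_arms, gamma=0.1):
        self.gamma = gamma
        self.weights = [1.0] * n_arms

    def probabilities(self):
        total = sum(self.weights)
        k = len(self.weights)
        return [(1 - self.gamma) * w / total + self.gamma / k
                for w in self.weights]

    def select_arm(self):
        probs = self.probabilities()
        return random.choices(range(len(probs)), weights=probs)[0]

    def update(self, arm, reward):
        # importance-weighted reward estimate keeps the update unbiased
        probs = self.probabilities()
        x_hat = reward / probs[arm]
        self.weights[arm] *= math.exp(self.gamma * x_hat / len(self.weights))
```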
This program applies Thompson sampling to ad selection, choosing the ad with the highest predicted probability of a click.
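A minimal Beta-Bernoulli Thompson sampling sketch for that setting (illustrative, not the program's actual code; the method names and the click/no-click reward encoding are assumptions):

```python
import random

class ThompsonSampling:
    """Beta-Bernoulli Thompson sampling for ad selection: keep a Beta
    posterior over each ad's click probability and show the ad whose
    sampled probability is highest."""

    def __init__(self, n_arms):
        self.alpha = [1] * n_arms  # prior successes + observed clicks
        self.beta = [1] * n_arms   # prior failures + observed non-clicks

    def select_arm(self):
        samples = [random.betavariate(a, b)
                   for a, b in zip(self.alpha, self.beta)]
        return max(range(len(samples)), key=samples.__getitem__)

    def update(self, arm, reward):
        # reward is 1 for a click, 0 otherwise
        if reward:
            self.alpha[arm] += 1
        else:
            self.beta[arm] += 1
```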
Music Recommendation system with a Contextual Multi-Armed Bandit
This repository contains hands-on code for tutorials at PRICAI 2023 on the topic of Reinforcement Learning for Digital Business.
MABSearch: The Bandit Way of Learning the Learning Rate - A Harmony Between Reinforcement Learning and Gradient Descent
Sending personalized marketing offers (called free play in a casino setting) to players based on their gaming behavior and demographic data.
Contextual Bandit Engine
How to approach the multi-armed bandit problem with different strategies.
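One such strategy is UCB1, which replaces random exploration with an optimism bonus; a minimal sketch (illustrative only, not from any repository listed here):

```python
import math

class UCB1:
    """UCB1: play each arm once, then pick the arm maximizing the
    empirical mean plus a confidence bonus that shrinks with pulls."""

    def __init__(self, n_arms):
        self.counts = [0] * n_arms
        self.values = [0.0] * n_arms
        self.t = 0

    def select_arm(self):
        self.t += 1
        for arm, n in enumerate(self.counts):
            if n == 0:
                return arm  # initialization: try every arm once
        return max(
            range(len(self.counts)),
            key=lambda a: self.values[a]
                + math.sqrt(2 * math.log(self.t) / self.counts[a]),
        )

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]
```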
This repository contains the code necessary for generating the figures presented in the paper titled "Cooperative Thresholded Lasso for Sparse Linear Bandit".
Python implementations of contextual bandit algorithms.
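A common contextual algorithm is disjoint LinUCB, which fits a ridge-regression reward model per arm and adds an upper-confidence bonus to each prediction; a minimal NumPy sketch (illustrative, not from any of the implementations listed here):

```python
import numpy as np

class LinUCB:
    """Disjoint LinUCB: one linear reward model per arm, scored as
    predicted reward plus a confidence bonus on the context vector."""

    def __init__(self, n_arms, dim, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(dim) for _ in range(n_arms)]    # X^T X + I per arm
        self.b = [np.zeros(dim) for _ in range(n_arms)]  # X^T r per arm

    def select_arm(self, x):
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b                       # ridge-regression weights
            bonus = self.alpha * np.sqrt(x @ A_inv @ x)
            scores.append(theta @ x + bonus)
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x
```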
MAB Simulator is a Python package that provides a framework for simulating and comparing multi-armed bandit algorithms.
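At its core, a simulator of this kind is a loop that draws rewards from a fixed environment and tracks regret per policy; the sketch below (illustrative, not this package's API) reuses the EpsilonGreedy, UCB1, and ThompsonSampling sketches above:

```python
import random

def simulate(policy, true_means, horizon=10_000, seed=0):
    """Run one Bernoulli-bandit episode and return cumulative pseudo-regret."""
    rng = random.Random(seed)
    best = max(true_means)
    regret = 0.0
    for _ in range(horizon):
        arm = policy.select_arm()
        reward = 1 if rng.random() < true_means[arm] else 0
        policy.update(arm, reward)
        regret += best - true_means[arm]  # pseudo-regret, not realized reward
    return regret

# Compare the sketches above on the same problem instance.
means = [0.10, 0.12, 0.20]
for cls in (EpsilonGreedy, UCB1, ThompsonSampling):
    print(cls.__name__, simulate(cls(len(means)), means))
```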
The official iRec command-line interface.
An A/B testing project built to determine whether a new version of a website's sign-up button performs better than the current one.
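Such an A/B test typically ends with a two-proportion z-test on the observed conversion rates; a minimal sketch (the sign-up counts below are hypothetical):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: 120/2000 sign-ups with the old button, 150/2000 with the new.
z, p = two_proportion_ztest(120, 2000, 150, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")
```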
A Julia package to compute Gittins indices for multi-armed bandits.
Applying anomaly detection methods to multi-armed bandit problems.
Some examples of methods for solving multi-armed bandit problems.
A library for multi-armed bandits.