george-jiexiong / multi-armed-bandit
Forked from abdullahkhan93/multi-armed-bandit.
A basic implementation of techniques for solving the multi-armed bandit (MAB) problem in the context of a marketing strategy. Four techniques, namely the Epsilon-Greedy approach, Upper Confidence Bound (UCB), Gradient Ascent, and Thompson Sampling, are used to choose the website most likely to receive a click.