Discussion Paper Details
Please find the details for DP14046 in an easy to copy and paste format below:
Full Details
Title: Undiscounted Bandit Games
Author(s): Godfrey Keller and Sven Rady
Publication Date: October 2019
Keyword(s): HJB equation, Markov perfect equilibrium, strategic experimentation, strong long-run average criterion, two-armed bandit, viscosity solution
Programme Area(s): Industrial Organization
Abstract: We analyze undiscounted continuous-time games of strategic experimentation with two-armed bandits. The risky arm generates payoffs according to a Lévy process with an unknown average payoff per unit of time which nature draws from an arbitrary finite set. Observing all actions and realized payoffs, players use Markov strategies with the common posterior belief about the unknown parameter as the state variable. We show that the unique symmetric Markov perfect equilibrium can be computed in a simple closed form involving only the payoff of the safe arm, the expected current payoff of the risky arm, and the expected full-information payoff, given the current belief. In particular, the equilibrium does not depend on the precise specification of the payoff-generating processes.
For full details and related downloads, please visit: https://cepr.org/active/publications/discussion_papers/dp.php?dpno=14046
Bibliographic Reference
Keller, G. and Rady, S. 2019. 'Undiscounted Bandit Games'. CEPR Discussion Paper No. 14046. London: Centre for Economic Policy Research. https://cepr.org/active/publications/discussion_papers/dp.php?dpno=14046