Discussion Paper Details

Please find the details for DP6983 in an easy-to-copy-and-paste format below:

Full Details

Title: Negatively Correlated Bandits

Author(s): Nicolas Klein and Sven Rady

Publication Date: October 2008

Keyword(s): Bayesian Learning, Exponential Distribution, Markov Perfect Equilibrium, Poisson Process, Strategic Experimentation, Two-Armed Bandit

Programme Area(s): Industrial Organization

Abstract: We analyze a two-player game of strategic experimentation with two-armed bandits. Each player has to decide in continuous time whether to use a safe arm with a known payoff or a risky arm whose likelihood of delivering payoffs is initially unknown. The quality of the risky arms is perfectly negatively correlated between players. In marked contrast to the case where both risky arms are of the same type, we find that learning will be complete in any Markov perfect equilibrium if the stakes exceed a certain threshold, and that all equilibria are in cutoff strategies. For low stakes, the equilibrium is unique, symmetric, and coincides with the planner's solution. For high stakes, the equilibrium is unique, symmetric, and tantamount to myopic behavior. For intermediate stakes, there is a continuum of equilibria.
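The learning process the abstract describes can be illustrated with a small simulation. The sketch below is not the paper's model code; it is a minimal discretized illustration, assuming (as one common specification) that a good risky arm delivers lump-sum payoffs via a Poisson process of intensity `LAM` while a bad arm never pays, so that silence drives the belief down and a single arrival reveals a good arm. The names `LAM`, `DT`, `update_no_arrival`, and `simulate` are all hypothetical choices for this illustration.

```python
import math
import random

# Assumed parameters for the illustration (not taken from the paper):
LAM = 1.0   # arrival intensity on a good risky arm
DT = 0.01   # time step used to discretize continuous time

def update_no_arrival(p, dt=DT, lam=LAM):
    """Bayes update of P(arm is good) after observing no arrival over dt.

    A good arm is silent with probability exp(-lam*dt); a bad arm is
    always silent, so silence is (weak) evidence the arm is bad.
    """
    good = p * math.exp(-lam * dt)
    return good / (good + (1.0 - p))

def simulate(p0, horizon, seed=0):
    """Simulate experimentation on player 1's risky arm.

    Draws the arm's true type once, then runs the discretized arrival
    process. Returns (final belief, number of arrivals observed).
    """
    rng = random.Random(seed)
    p, arrivals = p0, 0
    good = rng.random() < p0          # true type, unknown to the player
    for _ in range(int(horizon / DT)):
        if good and rng.random() < LAM * DT:
            p, arrivals = 1.0, arrivals + 1   # an arrival fully reveals a good arm
        else:
            p = update_no_arrival(p)
    return p, arrivals
```

Under perfect negative correlation between the players' risky arms, player 2's belief that their own arm is good is simply `1 - p`, so player 1's bad news is player 2's good news; this is the channel through which complete learning can obtain in equilibrium.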

Bibliographic Reference

Klein, N. and Rady, S. (2008), 'Negatively Correlated Bandits', CEPR Discussion Paper No. 6983. London: Centre for Economic Policy Research.