Exploration, Exploitation, and Engagement in Multi-Armed Bandits with Abandonment

Zixian Yang, Xin Liu, Lei Ying
Published in: Allerton (2022)
Keyphrases
  • bandit problems
  • exploration-exploitation
  • multi-armed bandits
  • decision problems
  • active learning
  • computational complexity
  • steady state