Multi-armed bandits for adjudicating documents in pooling-based evaluation of information retrieval systems.
David E. Losada
Javier Parapar
Álvaro Barreiro
Published in:
Inf. Process. Manag. (2017)
Keyphrases
multi-armed bandits
evaluation of information retrieval systems
retrieval systems
document collections
information retrieval systems
test collection
information retrieval
bandit problems
text documents
document clustering
multi-armed bandit
reinforcement learning
digital libraries
xml documents
least squares
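The title describes casting document adjudication in pooling-based IR evaluation as a multi-armed bandit problem: each retrieval run is an arm, pulling an arm means judging its next unjudged document, and the reward is whether that document turns out to be relevant. As a rough illustration only (the paper studies several bandit allocation strategies; the function name, inputs, and the epsilon-greedy policy here are hypothetical choices, not the authors' method), a minimal sketch might look like:

```python
import random

def epsilon_greedy_pool(runs, judge, budget, epsilon=0.1, seed=0):
    """Illustrative epsilon-greedy adjudication over pooled runs.

    runs   : dict mapping system name -> ranked list of doc ids (hypothetical input)
    judge  : callable doc_id -> 1 (relevant) or 0 (non-relevant)
    budget : total number of relevance judgments allowed
    """
    rng = random.Random(seed)
    pos = {s: 0 for s in runs}         # next unjudged rank position per run
    stats = {s: [0, 0] for s in runs}  # [relevant found, pulls] per run
    judged = {}                        # doc id -> judgment
    while len(judged) < budget:
        active = [s for s in runs if pos[s] < len(runs[s])]
        if not active:
            break  # every run is exhausted
        if rng.random() < epsilon:
            arm = rng.choice(active)   # explore a random run
        else:
            # exploit the run with the best observed precision so far
            # (unpulled arms get an optimistic estimate of 1.0)
            arm = max(active, key=lambda s: stats[s][0] / stats[s][1]
                      if stats[s][1] else 1.0)
        # skip documents already judged via another run
        while pos[arm] < len(runs[arm]) and runs[arm][pos[arm]] in judged:
            pos[arm] += 1
        if pos[arm] >= len(runs[arm]):
            continue
        doc = runs[arm][pos[arm]]
        pos[arm] += 1
        rel = judge(doc)
        judged[doc] = rel
        stats[arm][0] += rel
        stats[arm][1] += 1
    return judged
```

The intuition is that runs which keep surfacing relevant documents get pulled more often, so the judgment budget concentrates on the most promising parts of the pool instead of being spread uniformly across all submitted runs.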