ALE: A Simulation-Based Active Learning Evaluation Framework for the Parameter-Driven Comparison of Query Strategies for NLP.
Philipp Kohl, Nils Freyer, Yoka Krämer, Henri Werth, Steffen Wolf, Bodo Kraft, Matthias Meinecke, Albert Zündorf
Published in: CoRR (2023)
Keyphrases
- evaluation framework
- active learning
- relevance feedback
- evaluation process
- natural language processing
- query processing
- user queries
- evaluation methodology
- query expansion
- learning algorithm
- question answering
- evaluation measures
- information extraction
- support vector
- domain specific
- semantic annotation
- search queries
- evaluation metrics
- keywords
- information retrieval systems
- text classification
- web search
- query terms
- evaluation criteria
- machine learning