A Crowdsourcing Approach to Evaluate the Quality of Query-based Extractive Text Summaries.
Neslihan Iskender, Aleksandra Gabryszak, Tim Polzehl, Leonhard Hennig, Sebastian Möller
Published in: QoMEX (2019)
Keyphrases
- text summarization
- extractive summarization
- database
- keywords
- summary generation
- document summaries
- data structure
- automatic text summarization
- topic segmentation
- query expansion
- automatic summarization
- user queries
- multi document summarization
- query biased
- relevance feedback
- related documents
- text databases
- information retrieval
- natural language processing
- multimedia documents
- high quality
- query processing
- data sources
- range queries
- approximate answers
- query evaluation
- multidocument summarization
- text mining
- retrieval engine
- object retrieval
- semantically related
- user interaction
- response time
- document collections
- related words
- video search
- text retrieval