Towards a Reliable and Robust Methodology for Crowd-Based Subjective Quality Assessment of Query-Based Extractive Text Summarization.
Neslihan Iskender, Tim Polzehl, Sebastian Möller. Published in: LREC (2020)
Keyphrases
- text summarization
- quality assessment
- query expansion
- objective measures
- quality metrics
- subjective quality
- natural language processing
- multi document summarization
- named entity recognition
- video quality assessment
- question answering
- automatic summarization
- reduced reference
- information extraction
- image quality
- image quality assessment
- lexical chains
- extractive summarization
- human visual system
- video quality
- relevance feedback
- information retrieval systems
- image quality metrics
- visual quality
- automatic evaluation
- data sources
- information retrieval
- data quality
- databases
- markov random field
- high quality
- image processing