Crowdsourcing versus the laboratory: Towards crowd-based linguistic text quality assessment of query-based extractive summarization.
Neslihan Iskender, Tim Polzehl, Sebastian Möller. Published in: Qurator (2020)
Keyphrases
- quality assessment
- extractive summarization
- text summarization
- crowdsourcing
- database
- keywords
- relevance feedback
- query expansion
- quality metrics
- data quality
- information retrieval
- natural language processing
- post treatment
- high quality
- conditional random fields
- unsupervised learning
- knowledge discovery
- uncertainty handling
- machine learning