Towards anytime active learning: interrupting experts to reduce annotation costs.
Maria Eugenia Ramirez-Loaiza, Aron Culotta, Mustafa Bilgic
Published in: IDEA@KDD (2013)
Keyphrases
- active learning
- annotation effort
- unlabeled data
- cost-sensitive learning
- machine learning
- semi-supervised
- random sampling
- training set
- experimental design
- supervised learning
- learning strategies
- labeled data
- relevance feedback
- learning algorithm
- domain experts
- human experts
- cost savings
- pool-based active learning
- data mining
- cost-sensitive
- domain-specific
- total cost
- learning process
- annotation tool
- active learning strategies