Fine-tuning BERT for Low-Resource Natural Language Understanding via Active Learning.
Daniel Grießhaber, Johannes Maucher, Ngoc Thang Vu. Published in: COLING (2020)
Keyphrases
- natural language understanding
- fine-tuning
- active learning
- text understanding
- natural language
- knowledge representation
- semantic analysis
- language understanding
- viable alternative
- natural language processing
- fine-tune
- semantic representations
- semi-supervised
- resource allocation
- abductive reasoning
- spoken dialog systems
- fine-tuned
- dialogue system
- training set
- learning algorithm
- joint inference
- random sampling
- prior knowledge
- learning process
- high level
- machine learning
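The keyphrases above revolve around fine-tuning under active learning. As a minimal illustrative sketch only (not the paper's method, and with a hypothetical toy scorer `predict_proba` in place of BERT), pool-based active learning with uncertainty sampling selects the unlabeled examples the current model is least confident about:

```python
import math
import random

def predict_proba(weights, x):
    """Toy linear scorer squashed to a probability (stand-in for a real model)."""
    z = sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

def uncertainty(p):
    """Distance from the decision boundary; smaller means more uncertain."""
    return abs(p - 0.5)

def select_batch(weights, pool, k):
    """Pick the k pool items closest to the decision boundary for annotation."""
    return sorted(pool, key=lambda x: uncertainty(predict_proba(weights, x)))[:k]

random.seed(0)
weights = [1.0, -1.0]  # hypothetical current model parameters
pool = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(100)]
batch = select_batch(weights, pool, 5)  # examples to label next
```

In a full active-learning loop, the selected batch would be labeled, added to the training set, and the model fine-tuned again before the next query round; random sampling (also listed above) is the usual baseline this strategy is compared against.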