Fine-tuning BERT for Low-Resource Natural Language Understanding via Active Learning
Daniel Grießhaber, Johannes Maucher, Ngoc Thang Vu. Published in: CoRR (2020)
Keyphrases
- user model
- natural language understanding
- fine-tuning
- dialogue system
- active learning
- text understanding
- semantic analysis
- language understanding
- viable alternative
- domain independent
- natural language
- semantic representations
- knowledge representation
- natural language processing
- abductive reasoning
- fine-tune
- semi-supervised
- fine-tuned
- spoken dialog systems
- random sampling
- joint inference
- multi-agent
- learning algorithm
- learning process
- machine learning
- Chinese word segmentation
- lexical knowledge
- database
- resource allocation
- training set
- data mining