Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning
Shuo Xie, Jiahao Qiu, Ankita Pasad, Li Du, Qing Qu, Hongyuan Mei
Published in: CoRR (2022)
Keyphrases
- transfer learning
- language model
- hidden state
- reinforcement learning
- language modeling
- n-gram
- Markov models
- knowledge transfer
- information retrieval
- probabilistic model
- hidden Markov models
- text classification
- cross-domain
- query expansion
- labeled data
- retrieval model
- machine learning
- transfer knowledge
- learning algorithm
- machine learning algorithms
- dynamical systems
- context sensitive
- dynamic programming
- structure learning
- cross-lingual
- active learning
- text categorization
- Markov decision processes
- semi-supervised learning
- text mining
- collaborative filtering
- target domain