Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning.
Shuo Xie, Jiahao Qiu, Ankita Pasad, Li Du, Qing Qu, Hongyuan Mei. Published in: EMNLP (Findings) (2022)
Keyphrases
- language model
- transfer learning
- hidden state
- reinforcement learning
- language modeling
- knowledge transfer
- hidden Markov models
- information retrieval
- n-gram
- query expansion
- probabilistic model
- Markov models
- retrieval model
- semi-supervised learning
- cross-domain
- labeled data
- machine learning
- structure learning
- collaborative filtering
- context-sensitive
- target domain
- active learning
- text classification
- partially observable
- transfer knowledge
- state space
- dynamical systems
- text mining
- machine learning algorithms
- learning algorithm
- unlabeled data
- dynamic programming
- data analysis