s2s-ft: Fine-Tuning Pretrained Transformer Encoders for Sequence-to-Sequence Learning.
Hangbo Bao, Li Dong, Wenhui Wang, Nan Yang, Furu Wei. Published in: CoRR (2021)
Keyphrases
- fine tuning
- learning algorithm
- learning systems
- hidden state
- learning tasks
- active learning
- online learning
- learning process
- multi agent
- data mining
- sequence classification
- similarity measure
- sequence alignment
- video compression
- learning problems
- reinforcement learning
- mobile learning
- background knowledge
- unsupervised learning
- empirical studies
- supervised learning
- semi supervised
- fuzzy logic
- video sequences
- prior knowledge