Task-Attentive Transformer Architecture for Continual Learning of Vision-and-Language Tasks Using Knowledge Distillation.
Yuliang Cai, Jesse Thomason, Mohammad Rostami
Published in: CoRR (2023)
Keyphrases
- learning systems
- learning mechanisms
- knowledge acquisition
- prior knowledge
- learning algorithm
- multiple tasks
- knowledge transfer
- learned knowledge
- management system
- background knowledge
- language acquisition
- learning process
- reinforcement learning
- real time
- neural network
- learning capabilities
- previously learned
- knowledge level
- domain theory
- concept maps
- learning tasks
- programming language
- active learning
- knowledge base
- design principles
- human experts
- object oriented programming
- online learning
- complex domains
- supervised learning
- machine learning
- transferring knowledge