InfoCL: Alleviating Catastrophic Forgetting in Continual Text Classification from An Information Theoretic Perspective
Yifan Song, Peiyi Wang, Weimin Xiong, Dawei Zhu, Tianyu Liu, Zhifang Sui, Sujian Li
Published in: EMNLP (Findings) (2023)
Keyphrases
- information theoretic
- text classification
- distributional clustering
- mutual information
- information theory
- feature selection
- theoretic framework
- text categorization
- information bottleneck
- information theoretic measures
- naive Bayes
- log likelihood
- bag of words
- Jensen-Shannon divergence
- multi modality
- text mining
- multi label
- labeled data
- minimum description length
- machine learning
- entropy measure
- unlabeled data
- Bayesian networks
- KL divergence
- relative entropy
- knowledge discovery
- image registration
- k nearest neighbor