InfoCL: Alleviating Catastrophic Forgetting in Continual Text Classification from an Information Theoretic Perspective
Yifan Song, Peiyi Wang, Weimin Xiong, Dawei Zhu, Tianyu Liu, Zhifang Sui, Sujian Li
Published in: CoRR (2023)
Keyphrases
- information theoretic
- text classification
- distributional clustering
- mutual information
- information theory
- theoretic framework
- feature selection
- bag of words
- information theoretic measures
- text categorization
- information bottleneck
- relative entropy
- kullback-leibler divergence
- naive bayes
- text mining
- machine learning
- knn
- multi modality
- log likelihood
- computational learning theory
- minimum description length
- entropy measure
- jensen-shannon divergence
- multi label
- rough sets