RKLD: Reverse KL-Divergence-based Knowledge Distillation for Unlearning Personal Information in Large Language Models
Bichen Wang, Yuzhe Zi, Yixin Sun, Yanyan Zhao, Bing Qin
Published in: CoRR (2024)
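As a quick reference for the "reverse KL" in the title (general definitions, not the paper's specific unlearning objective): with a teacher distribution $p$ and a student distribution $q_\theta$ over a vocabulary $\mathcal{V}$, the forward and reverse KL divergences are

\[
D_{\mathrm{KL}}\!\left(p \,\|\, q_\theta\right) \;=\; \sum_{v \in \mathcal{V}} p(v)\,\log \frac{p(v)}{q_\theta(v)},
\qquad
D_{\mathrm{KL}}\!\left(q_\theta \,\|\, p\right) \;=\; \sum_{v \in \mathcal{V}} q_\theta(v)\,\log \frac{q_\theta(v)}{p(v)}.
\]

The title refers to distillation under the second (reverse) direction, which penalizes the student for placing probability mass where the teacher places little.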
Keyphrases
- language model
- personal information
- KL divergence
- language modeling
- translation model
- document retrieval
- n-gram
- information retrieval
- probabilistic model
- social networking
- third party
- retrieval model
- query expansion
- Mahalanobis distance
- mixture model
- knowledge discovery
- query terms
- test collection
- prior knowledge
- social networks
- information-theoretic
- topic models
- question answering
- context-sensitive
- data mining techniques
- vector space model
- Kullback-Leibler divergence
- data streams
- similarity measure