Distillation is All You Need for Practically Using Different Pre-trained Recommendation Models.
Wenqi Sun
Ruobing Xie
Junjie Zhang
Wayne Xin Zhao
Leyu Lin
Ji-Rong Wen
Published in:
CoRR (2024)
Keyphrases
pre-trained
probabilistic model
recommender systems
prior knowledge