Distillation is All You Need for Practically Using Different Pre-trained Recommendation Models.

Wenqi Sun, Ruobing Xie, Junjie Zhang, Wayne Xin Zhao, Leyu Lin, Ji-Rong Wen
Published in: CoRR (2024)
Keyphrases
  • pre-trained
  • probabilistic model
  • recommender systems
  • prior knowledge