Decoupled graph knowledge distillation: A general logits-based method for learning MLPs on graphs.

Yingjie Tian, Shaokai Xu, Muyang Li
Published in: Neural Networks (2024)