What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding.

Hongkang Li, Meng Wang, Tengfei Ma, Sijia Liu, Zaixi Zhang, Pin-Yu Chen
Published in: CoRR (2024)