Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings

Ta-Chung Chi, Ting-Han Fan, Li-Wei Chen, Alexander Rudnicky, Peter J. Ramadge
Published in: ACL (2) (2023)