Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings
Ta-Chung Chi, Ting-Han Fan, Li-Wei Chen, Alexander I. Rudnicky, Peter J. Ramadge. Published in: CoRR (2023)
Keyphrases
- positional information
- language model
- language modeling
- n-gram
- document retrieval
- retrieval model
- probabilistic model
- information retrieval
- speech recognition
- test collection
- query expansion
- context sensitive
- vector space model
- vector space
- statistical language models
- position information
- language models for information retrieval
- relevance model
- planar surfaces
- document ranking
- low dimensional
- topic models
- bayesian networks
- search engine