What Do Position Embeddings Learn? An Empirical Study of Pre-Trained Language Model Positional Encoding
Yu-An Wang, Yun-Nung Chen. Published in: EMNLP (1) (2020)
Keyphrases
- language model
- pre-trained
- language modeling
- n-gram
- training data
- probabilistic model
- document retrieval
- information retrieval
- query expansion
- speech recognition
- context-sensitive
- training examples
- smoothing methods
- control signals
- ad hoc information retrieval
- mixture model
- test collection
- retrieval model
- translation model
- generative model
- low-dimensional
- Bayesian networks