Convolutions and Self-Attention: Re-interpreting Relative Positions in Pre-trained Language Models
Tyler A. Chang, Yifan Xu, Weijian Xu, Zhuowen Tu
Published in: ACL/IJCNLP (1) (2021)
Keyphrases
- language model
- relative position
- pre-trained
- language modeling
- speech recognition
- n-gram
- probabilistic model
- geometric properties
- information retrieval
- query expansion
- training data
- training examples
- language models for information retrieval
- smoothing methods
- spatial relationships
- semi-supervised learning
- k-NN
- image data
- data analysis
- appearance variations
- decision trees
- feature selection
- data mining
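The paper's title refers to self-attention with relative position information. As a rough illustration of that general idea (not the paper's specific formulation), the sketch below implements Shaw-et-al.-style relative position embeddings in a single attention layer: each query attends to keys with an extra score term that depends only on the clipped offset between positions. All names (`relative_attention`, `rel_emb`, `max_dist`) are hypothetical, chosen for this sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_attention(q, k, v, rel_emb, max_dist):
    """Single-head attention with additive relative position scores.

    q, k, v:  (seq_len, d) query/key/value matrices.
    rel_emb:  (2*max_dist + 1, d) learned embeddings, one per clipped
              relative offset in [-max_dist, max_dist].
    """
    seq_len, d = q.shape
    # Content-content scores: q_i . k_j
    scores = q @ k.T
    # Map each pair (i, j) to the index of its clipped offset j - i.
    idx = np.arange(seq_len)
    rel = np.clip(idx[None, :] - idx[:, None], -max_dist, max_dist) + max_dist
    # Content-position scores: q_i . r_{clip(j - i)}
    scores = scores + np.einsum('id,ijd->ij', q, rel_emb[rel])
    weights = softmax(scores / np.sqrt(d))
    return weights @ v
```

With `rel_emb` set to zeros this reduces exactly to standard scaled dot-product attention, which makes the relative-position term easy to isolate when experimenting.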