The Case for Translation-Invariant Self-Attention in Transformer-Based Language Models
Ulme Wennberg, Gustav Eje Henter
Published in: ACL/IJCNLP (2) (2021)
Keyphrases
- language model
- translation invariant
- language modeling
- n-gram
- document retrieval
- statistical language models
- language modelling
- probabilistic model
- wavelet transform
- mathematical morphology
- retrieval model
- multiscale
- query expansion
- information retrieval
- denoising
- language models for information retrieval
- object identification
- test collection
- relevance model
- image processing
- shift invariant
- wavelet packet
- smoothing methods
- image analysis
- pattern recognition