Momentum Transformer: Closing the Performance Gap Between Self-attention and Its Linearization.
Tan Minh Nguyen
Richard G. Baraniuk
Robert M. Kirby
Stanley J. Osher
Bao Wang
Published in: MSML (2022)
Keyphrases
fuzzy logic
learning rate
morphological operators
focus of attention
real time
social networks
visual attention
distribution network
e learning
three dimensional
multiscale
object recognition
relational databases
multiresolution
fault diagnosis
power transformers