How Much Self-Attention Do We Need? Trading Attention for Feed-Forward Layers
Kazuki Irie
Alexander Gerstenberger
Ralf Schlüter
Hermann Ney
Published in: ICASSP (2020)
Keyphrases
feed forward
neural network
back propagation
artificial neural networks
data sets
focus of attention