Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers (Student Abstract).
Danilo Dordevic
Vukasin Bozic
Joseph Thommes
Daniele Coppola
Sidak Pal Singh
Published in: AAAI (2024)
Keyphrases
feed-forward neural networks
backpropagation
multi-layer
learning process
neural network
evolutionary computation
feed-forward
extreme learning machine