MLP-Attention: Improving Transformer Architecture with MLP Attention Weights
Alireza Morsali
Moein Heidari
Samin Heydarian
Tohid Abedini
Published in:
Tiny Papers @ ICLR (2023)
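The title suggests replacing the standard scaled dot-product attention weights with weights produced by a multilayer perceptron. As a hedged illustration only (not the authors' exact architecture), the sketch below scores each query-key pair with a small MLP over their concatenation instead of a dot product; the hidden size, tanh activation, and parameter names (`W1`, `b1`, `w2`, `b2`) are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mlp_attention(Q, K, V, W1, b1, w2, b2):
    """Attention where each score comes from an MLP over [q; k].

    Q: (n, d) queries, K: (m, d) keys, V: (m, d_v) values.
    W1: (2d, h), b1: (h,), w2: (h,), b2: scalar -- hypothetical
    one-hidden-layer scoring MLP, used here instead of q . k.
    """
    n, m = Q.shape[0], K.shape[0]
    # All pairwise concatenations [q_i; k_j] -> shape (n, m, 2d)
    pairs = np.concatenate(
        [np.repeat(Q[:, None, :], m, axis=1),
         np.repeat(K[None, :, :], n, axis=0)], axis=-1)
    h = np.tanh(pairs @ W1 + b1)   # hidden layer, (n, m, h)
    scores = h @ w2 + b2           # scalar score per pair, (n, m)
    A = softmax(scores, axis=-1)   # attention weights over keys
    return A @ V                   # (n, d_v)
```

Each row of the resulting weight matrix still sums to one via the softmax, so the output remains a convex combination of the values; only the scoring function changes.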
Keyphrases
multilayer perceptron
neural network
visual attention
database
management system
case study
multi agent
expert systems
fuzzy logic
vision system
hidden layer
hidden nodes