Do You Even Need Attention? A Stack of Feed-Forward Layers Does Surprisingly Well on ImageNet.
Luke Melas-Kyriazi. Published in: CoRR (2021)
Keyphrases
- feed forward
- neural nets
- back propagation
- neural network
- artificial neural networks
- biologically plausible
- artificial neural
- recurrent neural networks
- hidden layer
- visual cortex
- activation function
- feed forward neural networks
- image collections
- multi layer
- neural architecture
- spiking neurons
- error back propagation
- neuron model
- spiking neural networks
- recurrent networks
- fault diagnosis
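The idea named in the title — stacking feed-forward layers on image patches in place of attention — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the patch size, widths, depth, ReLU activation, and random (untrained) weights below are all illustrative assumptions. Each block applies one feed-forward layer across the patch (token) axis, standing in for attention, followed by an ordinary feed-forward layer across channels.

```python
import numpy as np

def patchify(img, p):
    """Split an (H, W, C) image into flattened p x p patches."""
    H, W, C = img.shape
    g = img.reshape(H // p, p, W // p, p, C).transpose(0, 2, 1, 3, 4)
    return g.reshape((H // p) * (W // p), p * p * C)

def ff(x, W1, b1, W2, b2):
    """Two-layer feed-forward block with ReLU and a residual connection."""
    return x + np.maximum(x @ W1 + b1, 0.0) @ W2 + b2

def init_block(n_tokens, dim, hidden, rng, s=0.02):
    """Random (untrained) weights for one token-mixing + channel-mixing block."""
    return dict(
        Wt1=rng.standard_normal((n_tokens, hidden)) * s, bt1=np.zeros(hidden),
        Wt2=rng.standard_normal((hidden, n_tokens)) * s, bt2=np.zeros(n_tokens),
        Wc1=rng.standard_normal((dim, hidden)) * s, bc1=np.zeros(hidden),
        Wc2=rng.standard_normal((hidden, dim)) * s, bc2=np.zeros(dim),
    )

def forward(img, depth=4, patch=4, dim=64, hidden=128, n_classes=10, seed=0):
    """Forward pass of an attention-free, all-feed-forward classifier."""
    rng = np.random.default_rng(seed)
    tokens = patchify(img, patch)                       # (N, p*p*C)
    n_tokens = tokens.shape[0]
    embed = rng.standard_normal((tokens.shape[1], dim)) * 0.02
    x = tokens @ embed                                  # (N, dim)
    for _ in range(depth):
        w = init_block(n_tokens, dim, hidden, rng)
        # feed-forward applied across the token axis stands in for attention
        x = ff(x.T, w["Wt1"], w["bt1"], w["Wt2"], w["bt2"]).T
        # ordinary position-wise feed-forward across channels
        x = ff(x, w["Wc1"], w["bc1"], w["Wc2"], w["bc2"])
    head = rng.standard_normal((dim, n_classes)) * 0.02
    return x.mean(axis=0) @ head                        # class logits

logits = forward(np.zeros((32, 32, 3)))
print(logits.shape)  # (10,)
```

A 32x32x3 input with 4x4 patches yields 64 tokens, so the token-mixing weights are shaped by the (fixed) patch count rather than by pairwise attention scores.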