An Empirical Study on Generalizations of the ReLU Activation Function
Chaity Banerjee, Tathagata Mukherjee, Eduardo Pasiliao Jr.
Published in: ACM Southeast Regional Conference (2019)
Keyphrases
- activation function
- neural network
- feed forward
- artificial neural networks
- hidden layer
- feed forward neural networks
- back propagation
- learning rate
- neural nets
- multilayer perceptron
- network architecture
- chaotic neural network
- network size
- sigmoid function
- hidden nodes
- radial basis function
- basis functions
- training phase
- autoregressive
- rbf neural network
- input space
- extreme learning machine
- fuzzy neural network
- convergence rate
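Since this listing carries no abstract, a brief illustration may help: the ReLU activation and two widely used generalizations, leaky ReLU and ELU. These are standard examples of ReLU generalizations, not necessarily the specific variants studied in the paper; the function names and default parameters below are illustrative choices.

```python
import math

def relu(x: float) -> float:
    # Standard rectified linear unit: max(0, x).
    return max(0.0, x)

def leaky_relu(x: float, alpha: float = 0.01) -> float:
    # Generalization: small non-zero slope alpha for negative inputs,
    # which avoids "dead" units with zero gradient.
    return x if x > 0 else alpha * x

def elu(x: float, alpha: float = 1.0) -> float:
    # Exponential linear unit: smooth for x < 0, saturating at -alpha.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```

All three agree with the identity function for positive inputs and differ only in how they treat the negative half-line, which is the axis along which most ReLU generalizations vary.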