A Novel Posit-based Fast Approximation of ELU Activation Function for Deep Neural Networks.
Marco Cococcioni, Federico Rossi, Emanuele Ruffaldi, Sergio Saponara. Published in: SMARTCOMP (2020)
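For context, the standard ELU activation that the paper targets is ELU(x) = x for x > 0 and α(eˣ − 1) otherwise. The sketch below is only this well-known reference definition in NumPy, not the posit-based fast approximation proposed by the authors.

```python
import numpy as np

# Standard ELU activation (reference definition, alpha = 1.0 by default).
# This is NOT the paper's posit-based approximation, just the function it approximates.
def elu(x, alpha=1.0):
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(elu([-2.0, 0.0, 2.0]))  # approx. [-0.8647, 0.0, 2.0]
```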
Keyphrases
- activation function
- neural network
- continuous functions
- feed forward
- feed forward neural networks
- back propagation
- hidden layer
- artificial neural networks
- hidden neurons
- neural nets
- multilayer perceptron
- backpropagation algorithm
- learning rate
- radial basis function
- single hidden layer
- fuzzy neural network
- basis functions
- feedforward neural networks
- hidden nodes
- chaotic neural network
- network architecture
- pattern recognition
- single layer
- multi layer
- multi layer perceptron
- extreme learning machine
- fault diagnosis
- small number
- feature space
- genetic algorithm