Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation.
Elisa Oostwal, Michiel Straat, Michael Biehl. Published in: CoRR (2019)
Keyphrases
- neural network
- activation function
- feed forward neural networks
- pattern recognition
- neural network model
- artificial neural networks
- backpropagation
- information processing
- feed forward
- training algorithm
- hidden information
- fully connected
- fault diagnosis
- self-organizing maps
- multiple layers
- multilayer perceptron
- recurrent neural networks
- neural nets
- neural network training
- competitive learning
- VC dimension
- radial basis function
- genetic algorithm
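
The title contrasts two hidden-unit activation functions in layered networks. Below is a minimal NumPy/SciPy sketch of the two choices applied in a simple layered ("soft committee") architecture; the erf-shaped sigmoid is a common choice in this line of work, and all function names, shapes, and parameters here are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np
from scipy.special import erf

def relu(x):
    # Rectified linear unit: g(x) = max(0, x)
    return np.maximum(0.0, x)

def sigmoidal(x):
    # Sigmoidal activation; the erf-shaped form g(x) = erf(x / sqrt(2))
    # is one common choice in analyses of layered networks (an assumption here).
    return erf(x / np.sqrt(2.0))

def committee_output(W, x, g):
    # Output of a simple layered network: the average of K hidden-unit
    # activations g(w_k . x). W has shape (K, N); x has shape (N,).
    return g(W @ x).mean()

# Usage: compare the two hidden-unit activations on the same random input.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 10))   # K = 3 hidden units, N = 10 inputs
x = rng.standard_normal(10)
print(committee_output(W, x, relu), committee_output(W, x, sigmoidal))
```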