On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks
Behnam Asadi, Hui Jiang
Published in: CoRR (2020)
Keyphrases
- activation function
- output layer
- neural network
- hidden layer
- back propagation
- feed forward
- feedforward neural networks
- artificial neural networks
- feed forward neural networks
- radial basis function
- multilayer perceptron
- connection weights
- neural nets
- recurrent neural networks
- learning rate
- multi layer perceptron
- rbf neural network
- fuzzy neural network
- network architecture
- neural network model
- training algorithm
- basis functions
- learning rules
- pattern recognition
- genetic algorithm
- fuzzy logic
- input output
- training process
- fuzzy systems
- training set
- machine learning