Weight initialization based-rectified linear unit activation function to improve the performance of a convolutional neural network model.
Bekhzod Olimov, Karshiev Sanjar, Eungyeong Jang, Sadia Din, Anand Paul, Jeonghong Kim. Published in: Concurr. Comput. Pract. Exp. (2021)
Keyphrases
- neural network model
- activation function
- network architecture
- neural network
- artificial neural networks
- multilayer perceptron
- connection weights
- hidden nodes
- hidden neurons
- sigmoid function
- BP neural network
- multi layer perceptron
- RBF neural network
- back propagation
- feed forward
- hidden layer
- feedforward neural networks
- back propagation neural network
- input variables
- autoregressive
- basis functions
- linear combination
- genetic algorithm
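The title refers to a weight-initialization scheme designed around the ReLU activation. The paper's own formulation is not reproduced here; the sketch below shows the standard ReLU-aware scheme of this kind, He (Kaiming) normal initialization with variance 2/fan_in, as a minimal illustration only. The function name `he_normal_init` and the layer sizes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def he_normal_init(fan_in: int, fan_out: int, rng=None):
    """He (Kaiming) normal initialization.

    Draws weights from N(0, 2/fan_in), the variance scaling commonly
    derived for ReLU activations so that signal variance is preserved
    across layers. Illustrative sketch, not the paper's exact method.
    """
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(loc=0.0, scale=std, size=(fan_in, fan_out))

# Example: initialize a fully connected layer with 256 inputs and 128 outputs.
W = he_normal_init(256, 128)
print(W.std())  # empirically close to sqrt(2/256) ≈ 0.088
```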