Expressive Numbers of Two or More Hidden Layer ReLU Neural Networks.
Kenta Inoue
Published in: CANDAR Workshops (2019)
Keyphrases
- hidden layer
- neural network
- back propagation
- activation function
- feed forward
- multilayer perceptron
- feedforward neural networks
- recurrent neural networks
- artificial neural networks
- neural nets
- radial basis function
- learning rate
- connection weights
- training algorithm
- backpropagation algorithm
- neural network structure
- error back propagation
- rbf neural network
- hidden neurons
- single hidden layer
- fuzzy logic
- rbfnn
- radial basis function neural network
- fuzzy neural network
- neural network model
- number of hidden units
- output layer
- number of hidden layers
- single layer
- hidden units
- bp neural network
- latent variables
- hidden nodes
- learning rules
- network architecture
- pattern recognition
- learning algorithm
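To make the paper's setting concrete, the following is a minimal sketch of a feed-forward network with two hidden ReLU layers, the architecture named in the title. The weights, layer sizes, and helper names (`relu`, `dense`, `forward`) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: forward pass of a feed-forward network with
# two hidden ReLU layers (weights are illustrative only).

def relu(x):
    # ReLU activation applied elementwise
    return [max(0.0, v) for v in x]

def dense(x, weights, bias):
    # Fully connected layer: one row of input weights per output unit
    return [sum(w * v for w, v in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def forward(x, layers):
    # layers: list of (weights, bias); ReLU on all but the output layer
    for i, (w, b) in enumerate(layers):
        x = dense(x, w, b)
        if i < len(layers) - 1:
            x = relu(x)
    return x

# Example: 2 inputs -> 3 ReLU units -> 3 ReLU units -> 1 linear output
layers = [
    ([[1.0, -1.0], [0.5, 0.5], [-1.0, 1.0]], [0.0, 0.0, 0.0]),
    ([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 1.0, -1.0]], [0.1, 0.0, 0.0]),
    ([[1.0, -1.0, 0.5]], [0.0]),
]
print(forward([1.0, 2.0], layers))
```

Because every layer is piecewise linear, the whole network computes a continuous piecewise-linear function of its input; results on the expressive power of such networks with two or more hidden layers are the subject of the paper above.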