Gradient Descent Learns One-hidden-layer CNN: Don't be Afraid of Spurious Local Minima
Simon S. Du, Jason D. Lee, Yuandong Tian, Barnabás Póczos, Aarti Singh
Published in: CoRR (2017)
Keyphrases
- hidden layer
- back propagation
- learning rate
- error function
- activation function
- feed forward
- neural network
- recurrent neural networks
- artificial neural networks
- feedforward neural networks
- cellular neural networks
- radial basis function
- multilayer perceptron
- neural nets
- rbf neural network
- feed forward neural networks
- cost function
- latent variables
- training algorithm
- simulated annealing
- global minimum
- output layer
- learning rules
- number of hidden layers
- fuzzy neural network
- pattern recognition
- objective function
- learning algorithm
- artificial intelligence
- hidden nodes
- fuzzy logic
- feature extraction
- genetic algorithm
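Several of the keyphrases above (hidden layer, activation function, learning rate, cost function, back propagation, global minimum) describe the basic setup of training a one-hidden-layer network by gradient descent. The sketch below illustrates that setup on a toy regression problem; it is an assumption-laden illustration (fully connected ReLU layer, squared loss, fixed output weights, synthetic teacher data), not the convolutional setting analyzed in the paper.

```python
# Minimal sketch: gradient descent on a one-hidden-layer ReLU network
# with a squared-error cost. All data and hyperparameters are toy
# choices for illustration, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data from a fixed "teacher": a single ReLU unit.
X = rng.standard_normal((64, 5))
w_true = rng.standard_normal(5)
y = np.maximum(X @ w_true, 0.0)

# Student: one hidden layer of k ReLU units; output weights kept fixed.
k = 8
W = rng.standard_normal((5, k)) * 0.5    # hidden-layer weights (learned)
a = np.full(k, 1.0 / k)                  # fixed output-layer weights

def loss(W):
    pred = np.maximum(X @ W, 0.0) @ a
    return 0.5 * np.mean((pred - y) ** 2)

lr = 0.1                                  # learning rate
losses = [loss(W)]
for _ in range(200):
    H = X @ W                             # pre-activations
    pred = np.maximum(H, 0.0) @ a
    err = (pred - y) / len(y)             # dL/dpred for the mean-squared cost
    # Back propagation: gradient of the cost w.r.t. hidden-layer weights,
    # using the ReLU indicator (H > 0) as the activation derivative.
    dH = np.outer(err, a) * (H > 0)
    W -= lr * (X.T @ dH)
    losses.append(loss(W))

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

With this seed and step size the training loss decreases steadily; in general, whether plain gradient descent reaches the global minimum here depends on the initialization, which is exactly the kind of question the paper studies.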