Exponential Convergence Time of Gradient Descent for One-Dimensional Deep Linear Neural Networks.
Ohad Shamir
Published in: CoRR (2018)
Keyphrases
- neural network
- update rule
- objective function
- cost function
- loss function
- weight update
- convex optimization
- convergence rate
- back propagation
- closed form
- multilayer perceptron
- deep learning
- artificial neural networks
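The setting named in the title can be sketched as follows. As an assumption (the paper's exact formulation is not reproduced on this page), a one-dimensional deep linear network of depth L computes x ↦ w_L ⋯ w_1 x, and gradient descent is run on the squared loss against a scalar target. The helper names `loss`, `grad`, and `gradient_descent` below are illustrative, not from the paper.

```python
import numpy as np

def loss(w, target):
    # Squared loss of the end-to-end product of scalar layer weights
    # against a scalar target (assumed setup).
    return 0.5 * (np.prod(w) - target) ** 2

def grad(w, target):
    # dL/dw_i = (prod(w) - target) * product of the other weights.
    p = np.prod(w)
    g = np.empty_like(w)
    for i in range(len(w)):
        g[i] = (p - target) * np.prod(np.delete(w, i))
    return g

def gradient_descent(depth, target=1.0, init=0.5, lr=0.01, steps=20000):
    # Plain gradient descent on all layer weights from a small
    # symmetric initialization.
    w = np.full(depth, init)
    for _ in range(steps):
        w -= lr * grad(w, target)
    return loss(w, target)
```

With small initialization, the end-to-end product `prod(w)` starts near zero and its gradients scale with products of the other weights, so deeper networks spend far longer in a near-flat region before converging, which is the depth-dependent slowdown the title refers to.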