Toward Moderate Overparameterization: Global Convergence Guarantees for Training Shallow Neural Networks
Samet Oymak, Mahdi Soltanolkotabi. Published in: IEEE J. Sel. Areas Inf. Theory (2020)
Keyphrases
- global convergence
- neural network
- training process
- training algorithm
- optimization methods
- convergence analysis
- convergence rate
- global optimum
- convergence speed
- constrained optimization problems
- feedforward neural networks
- convex minimization
- multi layer perceptron
- feed forward neural networks
- back propagation
- artificial neural networks
- genetic algorithm
- multilayer perceptron
- globally convergent
- coordinate ascent
- question answering
- training set
- natural language processing
- supervised learning
- fuzzy logic
- neural network training
- neural network model
- global optimization
- evolutionary computation
- support vector machine
- machine learning