Gradient Descent on Infinitely Wide Neural Networks: Global Convergence and Generalization
Francis R. Bach, Lenaïc Chizat. Published in: CoRR (2021)
Keyphrases
- global convergence
- neural network
- global optimum
- convergence rate
- convergence analysis
- convergence speed
- optimization methods
- constrained optimization problems
- objective function
- convex minimization
- line search
- coordinate ascent
- back propagation
- cost function
- artificial neural networks
- conjugate gradient
- finite number
- learning rules
- training algorithm
- fuzzy logic
- globally convergent
- linear combination
- gauss newton
- simulated annealing
- particle swarm
- genetic algorithm
- optimization method
- optimal solution
- step size
- convex optimization
- differential evolution
- sensitivity analysis