On the Convergence of Local Stochastic Compositional Gradient Descent with Momentum
Hongchang Gao, Junyi Li, Heng Huang. Published in: ICML (2022)
Keyphrases
- update rule
- learning rate
- stochastic approximation
- convergence rate
- Monte Carlo
- cost function
- error function
- convergence theorem
- stochastic nature
- convergence speed
- stochastic optimization
- faster convergence
- loss function
- neural network
- iterative algorithms
- learning automata
- conjugate gradient
- objective function
- stochastic context-free grammars
- probabilistic model
- pairwise
- stochastic models
- image processing
- computer vision
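The paper's subject, stochastic compositional optimization, targets objectives of the form min_x f(g(x)) where both the inner function g and the outer function f are only accessible through noisy samples. A minimal single-machine sketch is given below; it is an illustration of plain stochastic compositional gradient descent with momentum, not the authors' local (distributed) algorithm, and all names, step sizes, and the toy least-squares problem are assumptions chosen for the example.

```python
import numpy as np

# Toy compositional problem (assumed for illustration):
#   inner g(x) = A x observed with noise, outer f(y) = 0.5 * ||y - b||^2,
# so the composition reduces to a noisy least-squares objective.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 3))
b = rng.normal(size=5)

def sample_g(x):
    # Noisy evaluation of the inner function g(x)
    return A @ x + 0.01 * rng.normal(size=5)

def sample_jac_g(x):
    # Noisy evaluation of the Jacobian of g at x
    return A + 0.01 * rng.normal(size=(5, 3))

def grad_f(y):
    # Exact gradient of the outer function f(y) = 0.5 * ||y - b||^2
    return y - b

x = np.zeros(3)
y = sample_g(x)               # running estimate of the inner value g(x)
m = np.zeros(3)               # momentum buffer
eta, beta, gamma = 0.05, 0.9, 0.5  # step size, momentum, inner tracking rate

for _ in range(2000):
    # Track the inner function value with a moving average, since a single
    # noisy sample of g(x) would bias the gradient of the composition.
    y = (1 - gamma) * y + gamma * sample_g(x)
    # Compositional gradient estimate: Jac_g(x)^T grad_f(g(x))
    grad = sample_jac_g(x).T @ grad_f(y)
    # Heavy-ball style momentum update
    m = beta * m + (1 - beta) * grad
    x -= eta * m
```

After the loop, `x` approaches the least-squares solution of `A x ≈ b`; the moving-average estimate `y` is the standard device (as in SCGD-type methods) for handling the bias that the inner expectation introduces into the gradient of the composition.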