Convergence Rates of Stochastic Gradient Descent under Infinite Noise Variance.
Hongjian Wang, Mert Gürbüzbalaban, Lingjiong Zhu, Umut Simsekli, Murat A. Erdogdu
Published in: NeurIPS (2021)
Keyphrases
- stochastic gradient descent
- convergence rate
- step size
- regularization parameter
- number of iterations required
- convergence speed
- learning rate
- cross validation
- image restoration
- noisy images
- least squares
- image processing
- noise level
- hyperparameters
- machine learning
- training samples
- loss function
- feature selection
- learning algorithm
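Several of the keyphrases above (stochastic gradient descent, step size, least squares, loss function) can be tied together with a minimal sketch. This is an illustrative example only, not the paper's algorithm: plain SGD on a least-squares objective with a polynomially decaying step size; the problem sizes, the schedule exponent, and the constant `eta0` are all assumptions.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's method):
# SGD on f(x) = (1/2n) * ||A x - b||^2 with decaying step size
# eta_t = eta0 / (t + 1)**0.6.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

def loss(x):
    r = A @ x - b
    return 0.5 * np.mean(r ** 2)

x = np.zeros(d)
eta0 = 0.1
for t in range(2000):
    i = rng.integers(n)                  # sample one data point
    grad = (A[i] @ x - b[i]) * A[i]      # stochastic gradient of f at x
    x -= eta0 / (t + 1) ** 0.6 * grad    # decaying step size

print(loss(x))
```

The decay exponent 0.6 is chosen only to illustrate a step-size schedule between 1/2 and 1; the paper's actual rates concern heavy-tailed (infinite-variance) gradient noise, which this Gaussian toy problem does not exhibit.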