Convergence Rate Improvement of Richardson and Newton-Schulz Iterations.
Alexander Stotsky. Published in: CoRR (2020).
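The two iterations named in the title are classical schemes: Richardson iteration solves a linear system $Ax = b$ via repeated residual corrections, while Newton-Schulz iteration approximates a matrix inverse with quadratic convergence. The sketch below is a generic illustration of both (not the paper's accelerated variants); the function names, the step size `alpha`, and the iteration counts are illustrative assumptions.

```python
import numpy as np

def richardson_solve(A, b, alpha, iters=200):
    # Richardson iteration: x_{k+1} = x_k + alpha * (b - A x_k).
    # For symmetric positive definite A this converges when
    # 0 < alpha < 2 / lambda_max(A).
    x = np.zeros_like(b)
    for _ in range(iters):
        x = x + alpha * (b - A @ x)
    return x

def newton_schulz_inverse(A, iters=20):
    # Newton-Schulz iteration: X_{k+1} = X_k (2I - A X_k),
    # converging quadratically to A^{-1} when ||I - A X_0|| < 1.
    n = A.shape[0]
    # A common safe initial guess guaranteeing convergence:
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(n)
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

# Example on a small symmetric positive definite system (illustrative):
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = richardson_solve(A, b, alpha=0.3)
X = newton_schulz_inverse(A)
```

The convergence rate of both schemes is governed by the spectral radius of the iteration matrix ($I - \alpha A$ for Richardson, $I - A X_0$ for Newton-Schulz), which is the quantity the paper's acceleration techniques target.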
Keyphrases
- convergence rate
- stopping criterion
- primal dual
- step size
- convergence speed
- learning rate
- global convergence
- number of iterations required
- mutation operator
- gradient method
- clustering algorithm
- numerical stability
- least squares
- wavelet neural network
- neural network
- global optimization
- special case
- interior point methods
- significant improvement