A Proof That Anderson Acceleration Improves the Convergence Rate in Linearly Converging Fixed-Point Methods (But Not in Those Converging Quadratically).

Claire Evans, Sara N. Pollock, Leo G. Rebholz, Mengying Xiao
Published in: SIAM J. Numer. Anal. (2020)
Keyphrases
  • fixed point
  • convergence rate
  • sufficient conditions
  • gradient method
  • computer vision
  • graphical models
  • step size
  • learning rate
  • floating point
  • faster convergence rate
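
The paper's claim concerns Anderson acceleration applied to a fixed-point iteration x_{k+1} = g(x_k). Below is a minimal sketch of a standard depth-m Anderson scheme using the common unconstrained least-squares formulation over residual differences; it illustrates the technique named in the title, not the authors' own implementation, and the function name `anderson_acceleration` and its parameters (m, tol, max_iter) are illustrative choices.

```python
import numpy as np

def anderson_acceleration(g, x0, m=3, tol=1e-10, max_iter=100):
    """Depth-m Anderson acceleration of the fixed-point iteration x_{k+1} = g(x_k).

    Illustrative sketch only: uses the common unconstrained least-squares
    formulation over residual differences, not the paper's exact formulation.
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    G, F = [], []  # histories of g(x_k) and residuals f_k = g(x_k) - x_k
    for k in range(max_iter):
        gx = np.atleast_1d(g(x))
        f = gx - x
        G.append(gx)
        F.append(f)
        if np.linalg.norm(f) < tol:
            break
        mk = min(m, len(F) - 1)
        if mk == 0:
            # No history yet: take a plain Picard step.
            x = gx
        else:
            # Columns are differences of successive residuals / g-values.
            dF = np.column_stack([F[-j] - F[-j - 1] for j in range(1, mk + 1)])
            dG = np.column_stack([G[-j] - G[-j - 1] for j in range(1, mk + 1)])
            # Least-squares mixing coefficients, then the accelerated update.
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = gx - dG @ gamma
    return x, k

# Example: g(x) = cos(x) is a contraction near its fixed point, so the plain
# Picard iteration converges only linearly; the accelerated iterates typically
# reach the tolerance in far fewer steps.
x_star, iters = anderson_acceleration(lambda x: np.cos(x), x0=1.0)
print(x_star, iters)
```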