A new Gradient TD Algorithm with only One Step-size: Convergence Rate Analysis using L-λ Smoothness.
Hengshuai Yao
Published in: CoRR (2023)
Keyphrases
- convergence rate
- step size
- gradient method
- variable step size
- cost function
- steepest descent method
- convergence speed
- learning rate
- faster convergence
- temporal difference
- global convergence
- learning algorithm
- primal dual
- evolutionary programming
- search space
- mutation operator
- conjugate gradient
- line search
- global optimum
- stochastic gradient descent
- computational complexity
- levenberg marquardt
- numerical stability
- faster convergence rate
- non-negative matrix factorization
- premature convergence
- particle swarm optimization
- genetic programming
- optimization problems