Gradient Descent Is Optimal Under Lower Restricted Secant Inequality And Upper Error Bound
Charles Guille-Escuret
Adam Ibrahim
Baptiste Goujaud
Ioannis Mitliagkas
Published in: NeurIPS (2022)
Keyphrases
error bounds
worst case
theoretical analysis
slightly higher
wavelet synopses
optimal solution
finite sample
cost function
rademacher complexity
high dimensional
np hard
mutual information
polynomial time approximation