Sequential convergence of AdaGrad algorithm for smooth convex optimization.
Cheik Traoré
Edouard Pauwels
Published in:
CoRR (2020)
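The paper studies the AdaGrad algorithm, which scales each coordinate's step size by the accumulated squared gradients. As a hedged illustration (not the paper's own code), a minimal sketch of diagonal AdaGrad applied to a smooth convex quadratic; the function names, step size, and objective are illustrative assumptions:

```python
import numpy as np

def adagrad(grad, x0, lr=1.0, eps=1e-8, iters=500):
    """Diagonal AdaGrad sketch: per-coordinate steps lr / sqrt(sum of g^2)."""
    x = np.asarray(x0, dtype=float)
    g2 = np.zeros_like(x)  # running sum of squared gradients, coordinate-wise
    for _ in range(iters):
        g = grad(x)
        g2 += g * g
        x = x - lr * g / (np.sqrt(g2) + eps)  # adaptive, shrinking step sizes
    return x

# Illustrative smooth convex objective f(x) = 0.5 * x^T A x, A positive definite,
# whose unique minimizer is the origin.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
grad = lambda x: A @ x
x_star = adagrad(grad, x0=[5.0, -5.0])
```

The accumulated `g2` makes early, large gradients permanently damp later steps, which is the adaptive behavior whose sequential convergence the paper analyzes.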
Keyphrases
convex optimization
dynamic programming
learning algorithm
computational complexity
iterative algorithms
primal-dual
convergence rate
augmented Lagrangian
object recognition
NP-hard
linear programming
convex constraints
constrained optimization
distance metric
cost function
pairwise
high quality
image processing