Hybrid-order distributed SGD: Balancing communication overhead, computational complexity, and convergence rate for distributed learning.
Naeimeh Omidvar
Seyed Mohammad Hosseini
Mohammad Ali Maddah-Ali
Published in: Neurocomputing (2024)
Keyphrases
distributed learning
communication overhead
convergence rate
computational complexity
communication cost
convergence speed
step size
learning rate
learning environment
nearest neighbor
collaborative learning
multi-objective
distributed systems
gradient method
stochastic gradient descent