ScaleCom: Scalable Sparsified Gradient Compression for Communication-Efficient Distributed Training
Chia-Yu Chen, Jiamin Ni, Songtao Lu, Xiaodong Cui, Pin-Yu Chen, Xiao Sun, Naigang Wang, Swagath Venkataramani, Vijayalakshmi Srinivasan, Wei Zhang, Kailash Gopalakrishnan
Published in: NeurIPS (2020)
Keyphrases
- lightweight
- communication overhead
- fully distributed
- multi-agent
- distributed systems
- communication cost
- scalable distributed
- spatially distributed
- computer networks
- communication systems
- distributed computation
- multi-party
- distributed environment
- peer-to-peer
- data sets
- training phase
- compression scheme
- data intensive
- highly scalable
- communication networks
- load balance
- image compression
- edge detection
- weighted sums
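The paper's core idea is sparsified gradient compression: each worker transmits only the largest-magnitude gradient entries, cutting communication cost in distributed training. As a rough illustration only (not ScaleCom's actual algorithm, which adds cyclic local memory and low-pass filtering for scalability), here is a minimal sketch of plain top-k gradient sparsification; the function names are hypothetical:

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of a gradient tensor.

    Returns (indices, values): only ~k index/value pairs need to be
    communicated instead of the full dense gradient.
    """
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def desparsify(idx, values, shape):
    """Rebuild a dense (mostly zero) gradient from the transmitted pairs."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# Toy example: compress a 4x4 gradient down to its 3 largest entries.
rng = np.random.default_rng(0)
g = rng.normal(size=(4, 4))
idx, vals = topk_sparsify(g, 3)
g_hat = desparsify(idx, vals, g.shape)
```

In practice, schemes of this family also accumulate the discarded (untransmitted) residual locally and add it back before the next compression step, so that small gradient components are not lost permanently.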