Communication-efficient ADMM-based distributed algorithms for sparse training.
Guozheng Wang, Yongmei Lei, Yongwen Qiu, Lingfei Lou, Yixin Li
Published in: Neurocomputing (2023)
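For context on what "ADMM-based distributed sparse training" refers to, below is a minimal, generic sketch of consensus ADMM applied to a distributed lasso problem. It is not the algorithm from the paper, only a standard baseline of the family it belongs to; all names (`n_workers`, `rho`, `lam`, `soft_threshold`, the data shapes) are illustrative assumptions. Each simulated worker holds a data shard and only the vectors `x_i + u_i` are exchanged per round, which is the communication pattern that communication-efficient variants aim to reduce.

```python
# Minimal sketch of consensus ADMM for distributed lasso (illustrative, not
# the paper's method):
#   minimize (1/2) * sum_i ||A_i x - b_i||^2 + lam * ||x||_1
# Worker i holds shard (A_i, b_i); per iteration it communicates x_i + u_i.
import numpy as np

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def consensus_admm_lasso(shards, lam=0.1, rho=1.0, n_iters=100):
    """shards: list of (A_i, b_i) pairs, one per simulated worker."""
    n = shards[0][0].shape[1]
    N = len(shards)
    # Pre-factor each worker's local system (A_i^T A_i + rho I) once.
    cached = [(np.linalg.cholesky(A.T @ A + rho * np.eye(n)), A.T @ b)
              for A, b in shards]
    x = np.zeros((N, n))   # local primal variables
    u = np.zeros((N, n))   # scaled dual variables
    z = np.zeros(n)        # global consensus variable
    for _ in range(n_iters):
        # Local x-updates: solve (A_i^T A_i + rho I) x_i = A_i^T b_i + rho (z - u_i)
        # via the cached Cholesky factor (runs in parallel on real workers).
        for i, (L, Atb) in enumerate(cached):
            rhs = Atb + rho * (z - u[i])
            x[i] = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # Global z-update: average, then soft-threshold (induces sparsity).
        z = soft_threshold((x + u).mean(axis=0), lam / (rho * N))
        # Dual updates push local copies toward the consensus z.
        u += x - z
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x_true = np.zeros(50)
    x_true[:5] = rng.normal(size=5)          # ground-truth sparse signal
    shards = []
    for _ in range(4):                        # 4 simulated workers
        A = rng.normal(size=(40, 50))
        shards.append((A, A @ x_true + 0.01 * rng.normal(size=40)))
    z = consensus_admm_lasso(shards, lam=0.5)
    print("nonzeros recovered:", np.count_nonzero(np.abs(z) > 1e-3))
```

In this template, communication cost scales with the number of workers times the model dimension per iteration; the keyphrases below ("communication cost", "high dimensional", "fully distributed") reflect the axes along which such schemes are typically improved.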
Keyphrases
- computationally expensive
- computationally efficient
- lightweight
- avoid overfitting
- computer networks
- computational cost
- learning algorithm
- significant improvement
- worst case
- information sharing
- communication cost
- computationally intensive
- distributed computation
- fully distributed
- high dimensional
- supervised learning
- distributed systems
- computational complexity
- cooperative
- multi-agent
- distributed control
- neural network