Distributed Learning and Democratic Embeddings: Polynomial-Time Source Coding Schemes Can Achieve Minimax Lower Bounds for Distributed Gradient Descent under Communication Constraints.
Rajarshi Saha, Mert Pilanci, Andrea J. Goldsmith
Published in: CoRR (2021)
Keyphrases
- distributed learning
- coding scheme
- lower bound
- worst case
- lower and upper bounds
- objective function
- collaborative learning
- upper bound
- bitstream
- coding method
- image sequence coding
- knowledge integration
- branch and bound
- image transmission
- distributed learning environments
- min-sum
- transform coding
- grid technology
- shape coding
- VC dimension
- NP-hard
- computational complexity
- optimal solution
- e-learning