Gossip Distillation: Decentralized Deep Learning Transmitting Neither Training Data Nor Models
Taisuke Moriwaki
Kazuyuki Shudo
Published in: VCC (2023)
Keyphrases
deep learning
training data
data sets
probabilistic model
learning algorithm
prior knowledge
computer vision
overlay network