Multi-teacher knowledge distillation as an effective method for compressing ensembles of neural networks.

Konrad Zuchniak
Published in: CoRR (2023)