Improving Inference Latency and Energy of DNNs through Wireless Enabled Multi-Chip-Module-based Architectures and Model Parameters Compression.

Giuseppe Ascia, Vincenzo Catania, Andrea Mineo, Salvatore Monteleone, Maurizio Palesi, Davide Patti
Published in: NOCS (2020)