Improving Inference Latency and Energy of DNNs through Wireless Enabled Multi-Chip-Module-based Architectures and Model Parameters Compression.

Giuseppe Ascia, Vincenzo Catania, Andrea Mineo, Salvatore Monteleone, Maurizio Palesi, Davide Patti
Published in: NOCS (2020)