Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-Level Sparsity via Mixture-of-Experts.

Rishov Sarkar, Hanxue Liang, Zhiwen Fan, Zhangyang Wang, Cong Hao
Published in: ICCAD (2023)