Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-level Sparsity via Mixture-of-Experts.

Rishov Sarkar, Hanxue Liang, Zhiwen Fan, Zhangyang Wang, Cong Hao
Published in: CoRR (2023)