Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-Level Sparsity via Mixture-of-Experts
Rishov Sarkar
Hanxue Liang
Zhiwen Fan
Zhangyang Wang
Cong Hao
Published in: ICCAD (2023)
Keyphrases
memory efficient
multi task
multi task learning
learning tasks
group lasso
multiple tasks
multitask learning
computer vision
transfer learning
learning problems
sparse learning
sparse representation
high dimensional
decision trees
machine learning
high order
gaussian processes
active learning
data mining