Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-level Sparsity via Mixture-of-Experts.
Rishov Sarkar
Hanxue Liang
Zhiwen Fan
Zhangyang Wang
Cong Hao
Published in: CoRR (2023)
Keyphrases
memory efficient
multi task
multi task learning
learning tasks
group lasso
multitask learning
multi class
multiple tasks
computer vision
learning problems
gaussian processes
image processing
feature selection
unsupervised learning
high dimensional
feature extraction
sparse learning
feature space