SEER-MoE: Sparse Expert Efficiency through Regularization for Mixture-of-Experts.
Alexandre Muzio
Alex Sun
Churan He
Published in: CoRR (2024)
Keyphrases
domain experts
human experts
expert advice
sparsity regularization
mixture model
elastic net
mixed norm
sparse data
compressed sensing
compressive sensing
structured sparsity
rank minimization
sparse approximation
neural network
gaussian distribution
prior information
image reconstruction
model selection