MING-MOE: Enhancing Medical Multi-Task Learning in Large Language Models with Sparse Mixture of Low-Rank Adapter Experts.
Yusheng Liao, Shuyang Jiang, Yu Wang, Yanfeng Wang
Published in: CoRR (2024)
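The title refers to a sparse mixture of low-rank adapter (LoRA) experts. As a rough illustration only, the following is a minimal sketch of such a layer, assuming a frozen base projection, per-expert LoRA factors, and a token-level top-k router; all names, shapes, and defaults are hypothetical and not taken from the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoLoRALayer(nn.Module):
    """Hypothetical sketch of a sparse mixture of LoRA experts.

    A frozen base linear layer is augmented with n_experts low-rank
    adapters; a router picks top_k experts per token and mixes their
    outputs. Illustrative only, not the paper's actual architecture.
    """
    def __init__(self, d_in, d_out, n_experts=8, rank=4, top_k=2):
        super().__init__()
        self.base = nn.Linear(d_in, d_out)
        for p in self.base.parameters():
            p.requires_grad_(False)               # base model stays frozen
        self.router = nn.Linear(d_in, n_experts)  # token-level gating network
        # Per-expert low-rank update: delta_W_e = B_e @ A_e
        self.A = nn.Parameter(torch.randn(n_experts, rank, d_in) * 0.01)
        self.B = nn.Parameter(torch.zeros(n_experts, d_out, rank))
        self.top_k = top_k

    def forward(self, x):                          # x: (batch, d_in)
        gate = self.router(x)                      # (batch, n_experts)
        weights, idx = gate.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)       # renormalize over chosen experts
        out = self.base(x)
        for k in range(self.top_k):
            A = self.A[idx[:, k]]                  # (batch, rank, d_in)
            B = self.B[idx[:, k]]                  # (batch, d_out, rank)
            low = torch.einsum('brd,bd->br', A, x)       # down-projection
            delta = torch.einsum('bor,br->bo', B, low)   # up-projection
            out = out + weights[:, k:k+1] * delta
        return out

# Usage sketch:
# layer = MoLoRALayer(d_in=512, d_out=512)
# y = layer(torch.randn(4, 512))   # -> (4, 512)
```

Only the router and the low-rank factors are trainable here, which is the property that makes this family of methods parameter-efficient for multi-task fine-tuning.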
Keyphrases
- language model
- low-rank
- multi-task learning
- high-order
- mixture model
- multi-task
- missing data
- matrix factorization
- probabilistic model
- convex optimization
- singular value decomposition
- high-dimensional data
- information retrieval
- higher-order
- semi-supervised
- Gaussian processes
- linear combination
- learning tasks
- transfer learning
- expectation maximization
- high-dimensional
- pairwise
- learning problems
- Markov random field
- denoising
- supervised learning
- multi-class
- learning models