Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment.
Zhili Liu, Yunhao Gou, Kai Chen, Lanqing Hong, Jiahui Gao, Fei Mi, Yu Zhang, Zhenguo Li, Xin Jiang, Qun Liu, James T. Kwok
Published in: CoRR (2024)