MoME: Mixture of Multimodal Experts for Generalist Multimodal Large Language Models.
Leyang Shen, Gongwei Chen, Rui Shao, Weili Guan, Liqiang Nie
Published in: CoRR (2024)
Keyphrases
- language model
- mixture model
- language modeling
- n gram
- speech recognition
- probabilistic model
- multi modal
- document retrieval
- retrieval model
- information retrieval
- statistical language models
- ad hoc information retrieval
- expert finding
- relevance model
- smoothing methods
- language modelling
- test collection
- query expansion
- okapi bm25
- query terms
- document ranking
- translation model
- passage retrieval
- audio visual
- vector space model
- question answering
- expectation maximization
- search engine