MoME: Mixture of Multimodal Experts for Generalist Multimodal Large Language Models.

Leyang Shen, Gongwei Chen, Rui Shao, Weili Guan, Liqiang Nie
Published in: CoRR (2024)