LoRAMoE: Alleviating World Knowledge Forgetting in Large Language Models via MoE-Style Plugin.
Shihan Dou, Enyu Zhou, Yan Liu, Songyang Gao, Wei Shen, Limao Xiong, Yuhao Zhou, Xiao Wang, Zhiheng Xi, Xiaoran Fan, Shiliang Pu, Jiang Zhu, Rui Zheng, Tao Gui, Qi Zhang, Xuanjing Huang. Published in: ACL (1) (2024)
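The title names the core idea: a MoE-style plugin built from LoRA experts, intended to reduce forgetting of world knowledge while adapting a large language model. Below is a minimal, hypothetical sketch of how such a layer might be wired: a frozen base projection plus several low-rank experts mixed by a learned router. The class name `LoRAMoELinear`, the expert count, the rank, and the softmax router are illustrative assumptions; the paper's actual architecture and training constraints are not reproduced here.

```python
# Hypothetical sketch of a MoE-style LoRA plugin layer (not the paper's exact design):
# a frozen pretrained linear projection augmented with several low-rank (LoRA)
# experts whose outputs are mixed by a token-wise router.
import torch
import torch.nn as nn


class LoRAMoELinear(nn.Module):
    def __init__(self, in_features: int, out_features: int,
                 num_experts: int = 4, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        # Frozen base projection: keeps the pretrained (world-knowledge) weights intact.
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)

        # Each expert is a low-rank pair (A: d_in -> r, B: r -> d_out).
        self.lora_A = nn.ModuleList(
            nn.Linear(in_features, rank, bias=False) for _ in range(num_experts))
        self.lora_B = nn.ModuleList(
            nn.Linear(rank, out_features, bias=False) for _ in range(num_experts))
        for B in self.lora_B:
            nn.init.zeros_(B.weight)  # experts start as a no-op plugin

        # Token-wise router producing a soft mixture over experts.
        self.router = nn.Linear(in_features, num_experts, bias=False)
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, in_features)
        gate = torch.softmax(self.router(x), dim=-1)  # (batch, seq_len, num_experts)
        out = self.base(x)                            # frozen path
        for i, (A, B) in enumerate(zip(self.lora_A, self.lora_B)):
            out = out + gate[..., i:i + 1] * B(A(x)) * self.scaling
        return out


if __name__ == "__main__":
    layer = LoRAMoELinear(in_features=64, out_features=64)
    x = torch.randn(2, 10, 64)
    print(layer(x).shape)  # torch.Size([2, 10, 64])
```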
Keyphrases
- language model
- world knowledge
- n-gram
- language modeling
- bag of words
- knowledge sources
- feature generation
- noun phrases
- probabilistic model
- document retrieval
- retrieval model
- information retrieval
- query expansion
- background knowledge
- pseudo relevance feedback
- query terms
- natural language text
- test collection
- vector space model
- document representation
- domain knowledge
- retrieval effectiveness
- part of speech
- generative model
- feature selection