MixCE: Training Autoregressive Language Models by Mixing Forward and Reverse Cross-Entropies.
Shiyue Zhang, Shijie Wu, Ozan Irsoy, Steven Lu, Mohit Bansal, Mark Dredze, David S. Rosenberg
Published in: CoRR (2023)
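The title describes a training objective that mixes forward cross-entropy (standard MLE) with a reverse cross-entropy term. Since this index page carries no further detail, the PyTorch sketch below is only one plausible reading: the function name `mixce_loss`, the mixing weight `eta`, and the model-probability re-weighting used to stand in for the reverse CE are illustrative assumptions, not the paper's verified method.

```python
import torch
import torch.nn.functional as F

def mixce_loss(logits, targets, eta=0.5, ignore_index=-100):
    """Sketch of a mixed forward/reverse cross-entropy token loss.

    eta = 1.0 recovers ordinary MLE (forward CE); eta < 1.0 blends in a
    reverse-CE-style term, approximated here by re-weighting each token's
    CE with the model's own probability of the gold token. The weighting
    scheme and the stop-gradient on the weight are assumptions for this
    sketch, not a verified reproduction of the paper.
    """
    log_probs = F.log_softmax(logits, dim=-1)                  # (B, T, V)
    safe_targets = targets.clamp(min=0)                        # guard ignore_index
    gold_logp = log_probs.gather(-1, safe_targets.unsqueeze(-1)).squeeze(-1)
    mask = (targets != ignore_index).float()
    weight = eta + (1.0 - eta) * gold_logp.exp().detach()      # mixing coefficient
    per_token = -weight * gold_logp                            # weighted token CE
    return (per_token * mask).sum() / mask.sum().clamp(min=1.0)

# Toy usage: random logits/targets just to confirm the loss runs and backprops.
logits = torch.randn(2, 8, 100, requires_grad=True)
targets = torch.randint(0, 100, (2, 8))
loss = mixce_loss(logits, targets, eta=0.7)
loss.backward()
```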
Keyphrases
- language model
- autoregressive
- language modeling
- document retrieval
- moving average
- non-stationary
- probabilistic model
- retrieval model
- n-gram
- random fields
- Gaussian Markov random field
- speech recognition
- test collection
- query expansion
- information retrieval
- language modelling
- context-sensitive
- statistical language models
- smoothing methods
- language models for information retrieval
- pseudo-relevance feedback
- training set
- SAR images
- document ranking
- translation model
- Markov random field
- image retrieval