InfoBERT: Improving Robustness of Language Models from an Information Theoretic Perspective
Boxin Wang, Shuohang Wang, Yu Cheng, Zhe Gan, Ruoxi Jia, Bo Li, Jingjing Liu. Published in: CoRR (2020)
Keyphrases
- information theoretic
- language model
- information theory
- language modeling
- mutual information
- n-gram
- theoretic framework
- language modelling
- speech recognition
- probabilistic model
- document retrieval
- statistical language models
- information bottleneck
- retrieval model
- query expansion
- information theoretic measures
- information retrieval
- mixture model
- test collection
- Jensen-Shannon divergence
- relevance model
- entropy measure
- Kullback-Leibler divergence
- document ranking
- Jensen-Shannon
- language models for information retrieval
- pseudo relevance feedback
- query terms
- smoothing methods
- context sensitive
- translation model
- vector space model
- probability distribution
- pattern recognition