Bridging Information-Theoretic and Geometric Compression in Language Models.
Emily Cheng, Corentin Kervadec, Marco Baroni
Published in: EMNLP (2023)
Keyphrases
- information theoretic
- language model
- language modeling
- information theory
- mutual information
- probabilistic model
- theoretic framework
- n gram
- query expansion
- information retrieval
- speech recognition
- document retrieval
- context sensitive
- language modelling
- information theoretic measures
- test collection
- jensen shannon divergence
- geometric structure
- entropy measure
- smoothing methods
- statistical language models
- vector space model
- language models for information retrieval
- information bottleneck
- retrieval model
- query terms
- kullback leibler divergence
- document ranking
- mixture model
- image analysis
- image processing
- relevance model
- relevant documents
- generative model
- feature space
- pattern recognition
- decision trees
- computer vision