Entropy Rate Estimates for Natural Language - A New Extrapolation of Compressed Large-Scale Corpora
Ryosuke Takahira, Kumiko Tanaka-Ishii, Lukasz Debowski
Published in: Entropy (2016)
Keyphrases
- natural language
- natural language processing
- estimation error
- language processing
- small scale
- information theory
- data structure
- real world
- semantic representation
- mutual information
- information theoretic
- machine learning
- natural language interface
- real life
- conceptual graphs
- semantic interpretation
- natural language generation
- text corpora
- code length
- neural network
- natural language understanding
- semantic markup
- semantic analysis
- data compression
- machine translation
- rate distortion
- question answering
- text mining
- artificial intelligence