Information-Theoretic Considerations in Batch Reinforcement Learning.
Jinglin Chen, Nan Jiang. Published in: CoRR (2019)
Keyphrases
- information-theoretic
- reinforcement learning
- information theory
- mutual information
- theoretical framework
- information bottleneck
- information-theoretic measures
- log-likelihood
- multi-modality
- entropy measure
- minimum description length
- Jensen-Shannon divergence
- Kullback-Leibler divergence
- relative entropy
- computational learning theory
- machine learning
- distributional clustering
- Bregman divergences
- closed form
- probability distribution
- similarity measure
- image processing
- learning algorithm