Information-Theoretic Considerations in Batch Reinforcement Learning.
Jinglin Chen, Nan Jiang
Published in: ICML (2019)
Keyphrases
- information-theoretic
- reinforcement learning
- information theory
- mutual information
- theoretic framework
- information bottleneck
- entropy measure
- multi-modality
- log-likelihood
- Kullback-Leibler divergence
- information-theoretic measures
- Jensen-Shannon divergence
- relative entropy
- machine learning
- minimum description length
- rough sets
- computational learning theory
- semi-supervised
- learning algorithm
- KL divergence
- learning problems