Information-Theoretic Confidence Bounds for Reinforcement Learning.
Xiuyuan Lu, Benjamin Van Roy
Published in: CoRR (2019)
Keyphrases
- information-theoretic
- confidence bounds
- reinforcement learning
- information theory
- mutual information
- theoretical framework
- information bottleneck
- information-theoretic measures
- Jensen-Shannon divergence
- multi-modality
- entropy measure
- learning algorithm
- KL divergence
- log-likelihood
- minimum description length
- Kullback-Leibler divergence
- machine learning
- computational learning theory
- relative entropy
- image classification
- distributional clustering
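Several of the keyphrases above (KL divergence, Kullback-Leibler divergence, relative entropy) name the same quantity, which is central to information-theoretic confidence bounds. A minimal sketch of how it can be computed for discrete distributions; the function name `kl_divergence` is illustrative, not taken from the paper:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete
    distributions given as sequences of probabilities over the same
    support. Terms with p_i == 0 contribute zero by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: divergence of a fair coin from a heavily biased one.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))
```

Note that the divergence is asymmetric (`kl_divergence(p, q)` generally differs from `kl_divergence(q, p)`) and is zero only when the two distributions coincide.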