Examining the causal structures of deep neural networks using information theory.
Simon Mattsson, Eric J. Michaud, Erik Hoel. Published in: CoRR (2020)
Keyphrases
- information theory
- neural network
- information theoretic
- statistical learning
- Jensen-Shannon divergence
- statistical mechanics
- pattern recognition
- conditional entropy
- statistical physics
- Kullback-Leibler divergence
- artificial neural networks
- rate distortion theory
- relative entropy
- causal relationships
- information geometry
- observational data
- causal models
- mutual information
- probabilistic model
- Bayesian networks
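
Several of the keyphrases above name standard information-theoretic quantities (relative entropy / Kullback-Leibler divergence, Jensen-Shannon divergence, mutual information). As an illustrative sketch only, not code from the paper itself, the snippet below computes these quantities for small discrete distributions with NumPy; the function names and example distributions are hypothetical.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence (relative entropy) D_KL(p || q), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetrised KL to the mixture m = (p + q) / 2."""
    m = 0.5 * (np.asarray(p, dtype=float) + np.asarray(q, dtype=float))
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def mutual_information(joint):
    """Mutual information I(X;Y) from a joint probability table p(x, y), in nats."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

if __name__ == "__main__":
    p = [0.7, 0.2, 0.1]
    q = [0.1, 0.4, 0.5]
    print("KL(p || q) =", kl_divergence(p, q))
    print("JS(p, q)   =", js_divergence(p, q))
    # A joint distribution in which X and Y are strongly correlated.
    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
    print("I(X; Y)    =", mutual_information(joint))
```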