Quantifying Total Influence between Variables with Information Theoretic and Machine Learning Techniques.
Andrea Murari, Riccardo Rossi, Michele Lungaroni, Pasquale Gaudio, Michela Gelfusa. Published in: Entropy (2020)
Keyphrases
- information theoretic
- information theory
- mutual information
- theoretic framework
- machine learning
- information bottleneck
- information theoretic measures
- machine learning algorithms
- variable selection
- kullback leibler divergence
- minimum description length
- relative entropy
- log likelihood
- entropy measure
- random variables
- multi modality
- pattern recognition
- computational learning theory
- kl divergence
- image segmentation
- data mining
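
The keyphrases above centre on mutual information as an information-theoretic measure of the total influence between random variables. As a purely illustrative sketch (not the estimators developed in the paper), the snippet below uses scikit-learn's k-nearest-neighbour based `mutual_info_regression` to estimate mutual information on synthetic data; the variable names and the generated data are assumptions made only for this example.

```python
# Illustrative sketch: estimating mutual information between pairs of variables.
# This is NOT the paper's method; it relies on scikit-learn's k-NN estimator
# and hypothetical synthetic data chosen solely for demonstration.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=2000)                 # driver variable
y = np.sin(3.0 * x) + 0.1 * rng.normal(size=x.size)   # nonlinear response + noise
z = rng.normal(size=x.size)                           # independent control variable

# mutual_info_regression expects a 2-D feature matrix and a 1-D target.
features = np.column_stack([x, z])
mi = mutual_info_regression(features, y, n_neighbors=3, random_state=0)

print(f"I(x; y) ~ {mi[0]:.3f} nats  (dependent pair, expected clearly > 0)")
print(f"I(z; y) ~ {mi[1]:.3f} nats  (independent pair, expected close to 0)")
```

Mutual information is the Kullback-Leibler divergence (relative entropy) between the joint distribution and the product of the marginals, which is why several of the listed keyphrases refer to the same underlying quantity.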