Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels
Miquel Payaró, Daniel Pérez Palomar
Published in: CoRR (2009)
Keyphrases
- mutual information
- information theoretic
- information theory
- feature selection
- image registration
- information content
- Shannon entropy
- Kullback-Leibler divergence
- information theoretic measures
- similarity measure
- normalized mutual information
- conditional entropy
- information gain
- maximum likelihood
- medical image registration
- relative entropy
- multi channel
- neural network
- conditional mutual information
- multimodal image registration
- minimum error
- power consumption
- support vector
- machine learning
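
For context, the linear vector Gaussian channel named in the title is conventionally written as below. This is a minimal sketch of the standard setting only, not notation taken from the paper itself: the symbols H (channel matrix), Q (input covariance), and n (noise) are generic, and the closed-form mutual information shown assumes a Gaussian input, which is a well-known special case rather than the paper's general result.

```latex
% Illustrative sketch of the standard linear vector Gaussian channel
% (generic notation, not necessarily that of the paper):
%   y = H x + n,  with white Gaussian noise n.
\[
  \mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{n},
  \qquad \mathbf{n} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}).
\]
% For a zero-mean Gaussian input with covariance Q, the mutual information
% (in nats) takes the classical log-det form:
\[
  I(\mathbf{x};\mathbf{y})
  = \tfrac{1}{2}\,\log\det\!\bigl(\mathbf{I}
    + \mathbf{H}\mathbf{Q}\mathbf{H}^{\mathsf{T}}\bigr),
  \qquad \mathbf{x} \sim \mathcal{N}(\mathbf{0}, \mathbf{Q}).
\]
```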