Derivative of the relative entropy over the Poisson and Binomial channel
Camilo G. Taborda, Fernando Pérez-Cruz
Published in: ITW (2012)
Keyphrases
- relative entropy
- information theoretic
- information theory
- mutual information
- log likelihood
- covariance matrix
- Mahalanobis distance
- squared Euclidean distance
- closed form
- maximum entropy
- Kullback-Leibler divergence
- probability distribution
- image processing
- feature selection
- graphical models
- Bregman divergences
- objective function
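
As background for the keyphrases above (relative entropy, Kullback-Leibler divergence, closed form), the sketch below evaluates the standard closed-form relative entropy between two Poisson distributions, D(Poisson(λp) || Poisson(λq)) = λp log(λp/λq) + λq - λp, and checks it against the defining series. This is illustrative only, not the derivative result of the paper; the function names and example rates are assumptions introduced here.

```python
import math

def kl_poisson(lam_p, lam_q):
    """Closed-form relative entropy D(Poisson(lam_p) || Poisson(lam_q))."""
    return lam_p * math.log(lam_p / lam_q) + lam_q - lam_p

def kl_poisson_series(lam_p, lam_q, n_terms=200):
    """Numerical check: sum p(k) * log(p(k) / q(k)) over the Poisson support."""
    total = 0.0
    for k in range(n_terms):
        # Work in log space to avoid overflow in k! for larger k.
        log_p = -lam_p + k * math.log(lam_p) - math.lgamma(k + 1)
        log_q = -lam_q + k * math.log(lam_q) - math.lgamma(k + 1)
        total += math.exp(log_p) * (log_p - log_q)
    return total

# The two computations should agree to high precision (rates chosen arbitrarily).
print(kl_poisson(3.0, 2.0))
print(kl_poisson_series(3.0, 2.0))
```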