Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
Yung-Kyun Noh, Masashi Sugiyama, Song Liu, Marthinus Christoffel du Plessis, Frank Chongwoo Park, Daniel D. Lee
Published in: Neural Comput. (2018)
Keyphrases
- metric learning
- Kullback-Leibler divergence
- nearest neighbor
- distance function
- distance metric
- probability density function
- distance measure
- mutual information
- semi-supervised
- information theoretic
- pairwise
- learning tasks
- information theory
- k-nearest neighbor
- dimensionality reduction
- kNN
- probability density
- multi-task
- semi-supervised learning
- data points
- feature space
- Mahalanobis distance
- density estimation
- diffusion tensor
- marginal distributions
- high dimensional data
- high dimensional
- Euclidean distance
- random variables
- machine learning
- training set
- machine learning algorithms
- feature extraction
- image processing
- computer vision
- learning algorithm
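The listing carries no abstract, but the title and keyphrases point to the standard k-nearest-neighbor estimator of the Kullback-Leibler divergence. As context, here is a minimal sketch of that plain estimator with Euclidean distances (the Wang-Kulkarni-Verdú form); it is not the paper's bias-reduced or metric-learned variant, and the function name is illustrative:

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """Nearest-neighbor estimate of D(p || q).

    x: (n, d) array of samples from p.
    y: (m, d) array of samples from q.
    k: which nearest neighbor to use for the local density ratio.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_k(i): distance from x_i to its k-th nearest neighbor
    # among the other samples of x (self excluded).
    dxx = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(axis=-1))
    np.fill_diagonal(dxx, np.inf)
    rho = np.sort(dxx, axis=1)[:, k - 1]

    # nu_k(i): distance from x_i to its k-th nearest neighbor in y.
    dxy = np.sqrt(((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1))
    nu = np.sort(dxy, axis=1)[:, k - 1]

    # D-hat = (d/n) * sum_i log(nu_i / rho_i) + log(m / (n - 1))
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

A metric-learning refinement of the kind the title describes would replace the Euclidean distances above with a learned Mahalanobis distance, which reshapes the neighborhoods the estimator uses without changing the formula itself.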