Information-theoretic bounds on quantum advantage in machine learning.
Hsin-Yuan Huang, Richard Kueng, John Preskill
Published in: CoRR (2021)
Keyphrases
- information theoretic
- machine learning
- mutual information
- information theory
- theoretic framework
- information theoretic measures
- information bottleneck
- upper bound
- entropy measure
- Jensen-Shannon divergence
- pattern recognition
- log likelihood
- active learning
- computational learning theory
- machine learning algorithms
- machine learning methods
- data mining
- computer vision
- multi modality
- decision trees
- feature selection
- worst case
- minimum description length
- information extraction
- learning tasks
- Kullback-Leibler divergence
- supervised learning
- learning algorithm
- model selection
- similarity measure
- image registration