Boosting Fuzzer Efficiency: An Information Theoretic Perspective.
Marcel Böhme, Valentin J. M. Manès, Sang Kil Cha. Published in: ESEC/SIGSOFT FSE (2020)
Keyphrases
- information theoretic
- information theory
- mutual information
- theoretic framework
- log likelihood
- information bottleneck
- information theoretic measures
- entropy measure
- Jensen-Shannon divergence
- relative entropy
- minimum description length
- multi-modality
- Kullback-Leibler divergence
- computational learning theory
- KL divergence
- data mining
- multi-modal
- support vector
- image classification
- image processing
- feature selection
- machine learning