DistillSpec: Improving Speculative Decoding via Knowledge Distillation.
Yongchao Zhou, Kaifeng Lyu, Ankit Singh Rawat, Aditya Krishna Menon, Afshin Rostamizadeh, Sanjiv Kumar, Jean-François Kagy, Rishabh Agarwal. Published in: CoRR (2023)