Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning
Lin Zhang, Li Shen, Liang Ding, Dacheng Tao, Ling-Yu Duan
Published in: CoRR (2022)
Keyphrases
- prior knowledge
- fine tuning
- experimental data
- expert knowledge
- raw data
- simulation data
- learning models
- probability distribution
- accurate models
- background knowledge
- learned models
- human experts
- knowledge acquisition
- data mining techniques
- knowledge management
- data sources
- probabilistic model
- network structure
- conceptual model
- data sets
- sensory data
- domain experts
- learning systems
- data collection
- input data
- database
- training data
- learning process
- prior domain knowledge
- missing data
- missing information
- distributed data
- bayesian methods
- knowledge transfer
- data analysis
- domain knowledge
- statistical methods
- knowledge discovery
- expert systems
- global information
- learning mechanism
- data structure
- high level
- machine learning
- data points
- additional knowledge
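Several keyphrases above (fine tuning, knowledge transfer) point at knowledge distillation, the technique named in the title. As a rough orientation only, here is a minimal sketch of the *generic* temperature-scaled distillation loss; it is not the paper's specific data-free method, which is not described on this page, and all function names are illustrative:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_kl(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) on temperature-softened outputs:
    # the standard knowledge-distillation objective.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Identical logits give zero divergence; mismatched logits give a
# positive value that the student is trained to minimize.
same = distillation_kl([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
diff = distillation_kl([2.0, 0.5, -1.0], [0.0, 1.0, 1.0])
print(round(same, 6), diff > 0)
```

In a data-free setting, the teacher/student pairs would be evaluated on synthesized rather than real inputs, but the loss itself takes the same form.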