Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning.
Lin Zhang, Li Shen, Liang Ding, Dacheng Tao, Ling-Yu Duan. Published in: CVPR (2022)
Keyphrases
- prior knowledge
- experimental data
- fine-tuning
- raw data
- learning models
- background knowledge
- expert knowledge
- hidden variables
- probabilistic model
- training data
- learned models
- simulation data
- data mining techniques
- knowledge acquisition
- learning scheme
- accurate models
- human experts
- prior domain knowledge
- data sets
- database
- high level
- learning systems
- additional knowledge
- learning algorithm
- input data
- data collection
- statistical methods
- network structure
- conceptual model
- domain knowledge
- learning process
- data analysis
- EM algorithm
- Bayesian methods
- sensory data
- computational model
- global knowledge
- test data
- social networks
- missing information
- knowledge base
- learning mechanism
- global information
- empirical data
- knowledge transfer
- knowledge discovery