Xingxing Yao
Publication Activity (10 Years)
Years Active: 2004-2023
Publications (10 Years): 5
Publications
- Hai Liu, Xingxing Yao, Xiangyu Kong. Training Model by Knowledge Distillation for Image-text Matching. ICAICE (2023). Uses knowledge distillation to compress pre-trained models for image-text matching tasks; designs lightweight models that, after distillation-based training, achieve better results than previously ineffective models.
- Hao Wu, Xingxing Yao, Baodi Liu, Xiaoping Lu, Weifeng Liu
- Minglian Duan, Xingxing Yao, Jiuzhen Zhang