Soft-Labeled Contrastive Pre-Training for Function-Level Code Representation
Xiaonan Li, Daya Guo, Yeyun Gong, Yun Lin, Yelong Shen, Xipeng Qiu, Daxin Jiang, Weizhu Chen, Nan Duan
Published in: EMNLP (Findings) (2022)
Keyphrases
- supervised learning
- training set
- higher level
- training data
- levels of abstraction
- training process
- learning algorithm
- fully labeled
- labeled training data
- multi-valued
- knowledge level
- lower level
- test set
- training examples
- data sets
- unlabeled data
- semi-supervised learning
- image representation
- online learning
- open source