CLOWER: A Pre-trained Language Model with Contrastive Learning over Word and Character Representations
Borun Chen, Hongyin Tang, Jiahao Bu, Kai Zhang, Jingang Wang, Qifan Wang, Hai-Tao Zheng, Wei Wu, Liqian Yu
Published in: COLING (2022)
Keyphrases
- language model
- n-gram
- word clouds
- document retrieval
- reinforcement learning
- language modeling
- learning algorithm
- information retrieval
- active learning
- probabilistic model
- supervised learning
- co-occurrence
- prior knowledge
- retrieval model
- learning process
- statistical machine translation
- neural network
- labeled data
- face recognition
- test collection
- context sensitive
- pre-trained
- word error rate