CLOWER: A Pre-trained Language Model with Contrastive Learning over Word and Character Representations.
Borun Chen, Hongyin Tang, Jingang Wang, Qifan Wang, Hai-Tao Zheng, Wei Wu, Liqian Yu. Published in: CoRR (2022)
Keyphrases
- language model
- n-gram
- learning process
- probabilistic model
- word clouds
- learning algorithm
- language modeling
- pre-trained
- supervised learning
- prior knowledge
- active learning
- document retrieval
- retrieval model
- mixture model
- statistical machine translation
- translation model
- ad hoc information retrieval
- context-sensitive
- speech recognition
- query expansion
- dimensionality reduction
- co-occurrence
- feature vectors
- training data