Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models.
Yuxuan Lai, Yijia Liu, Yansong Feng, Songfang Huang, Dongyan Zhao
Published in: CoRR (2021)
Keyphrases
- language model
- multi-granularity
- pre-trained
- language modeling
- n-gram
- multi-user
- dynamic integration
- probabilistic model
- speech recognition
- retrieval model
- information retrieval
- training data
- location-aware
- privacy protection
- control signals
- query expansion
- word segmentation
- relevance model
- training examples
- data mining
- appearance variations
- smoothing methods
- image sequences
- machine learning