Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models.
Yuxuan Lai, Yijia Liu, Yansong Feng, Songfang Huang, Dongyan Zhao
Published in: NAACL-HLT (2021)
Keyphrases
- feature space
- language model
- multi-granularity
- pre-trained
- language modeling
- n-gram
- multi-user
- probabilistic model
- dynamic integration
- training data
- retrieval model
- training examples
- information retrieval
- speech recognition
- word segmentation
- query expansion
- location-aware
- privacy protection
- language models for information retrieval
- training set
- smoothing methods
- seamless integration
- information extraction
- decision trees
- computer vision
- neural network