BERTweetFR: Domain Adaptation of Pre-Trained Language Models for French Tweets
Yanzhu Guo, Virgile Rennard, Christos Xypolopoulos, Michalis Vazirgiannis. Published in: W-NUT (2021)
Keyphrases
- language model
- domain adaptation
- pre-trained
- training data
- language modeling
- labeled data
- document retrieval
- probabilistic model
- cross-domain
- training examples
- information retrieval
- n-gram
- multiple sources
- named entities
- test collection
- target domain
- semi-supervised
- query expansion
- unlabeled data
- retrieval model
- speech recognition
- news articles
- semi-supervised learning
- sentiment classification
- document classification
- transfer learning
- supervised learning
- training set
- learning algorithm
- pairwise
- learning process
- prior knowledge
- information extraction