BERTweetFR : Domain Adaptation of Pre-Trained Language Models for French Tweets.
Yanzhu Guo, Virgile Rennard, Christos Xypolopoulos, Michalis Vazirgiannis
Published in: CoRR (2021)
Keyphrases
- domain adaptation
- language model
- pre-trained
- training data
- language modeling
- labeled data
- cross-domain
- training examples
- n-gram
- multiple sources
- speech recognition
- probabilistic model
- document retrieval
- query expansion
- retrieval model
- named entities
- information retrieval
- semi-supervised
- target domain
- transfer learning
- co-training
- semi-supervised learning
- news articles
- unlabeled data
- sentiment classification
- document classification
- data mining
- learning algorithm
- decision trees
- test collection
- supervised learning
- data sets
- co-occurrence
- machine learning
- small number