MasakhaNER 2.0: Africa-centric Transfer Learning for Named Entity Recognition.
David Ifeoluwa Adelani, Graham Neubig, Sebastian Ruder, Shruti Rijhwani, Michael Beukman, Chester Palen-Michel, Constantine Lignos, Jesujoba O. Alabi, Shamsuddeen Hassan Muhammad, Peter Nabende, Cheikh M. Bamba Dione, Andiswa Bukula, Rooweither Mabuya, Bonaventure F. P. Dossou, Blessing Sibanda, Happy Buzaaba, Jonathan Mukiibi, Godson Kalipe, Derguene Mbaye, Amelia Taylor, Fatoumata Ouoba Kabore, Chris Chinenye Emezue, Aremu Anuoluwapo, Perez Ogayo, Catherine Gitau, Edwin Munkoh-Buabeng, Victoire Memdjokam Koagne, Allahsera Auguste Tapo, Tebogo Macucwa, Vukosi Marivate, Elvis Mboning, Tajuddeen Gwadabe, Tosin P. Adewumi, Orevaoghene Ahia, Joyce Nakatumba-Nabende, Neo L. Mokono, Ignatius Ezeani, Chiamaka Chukwuneke, Mofetoluwa Adeyemi, Gilles Hacheme, Idris Abdulmumin, Odunayo Ogundepo, Oreen Yousuf, Tatiana Moteu Ngoli, Dietrich Klakow
Published in: CoRR (2022)
Keyphrases
- transfer learning
- named entity recognition
- semi-supervised
- labeled data
- information extraction
- named entities
- natural language processing
- learning tasks
- semi-supervised learning
- maximum entropy
- knowledge transfer
- text summarization
- cross-domain
- machine learning
- active learning
- reinforcement learning
- text mining
- conditional random fields
- unlabeled data
- annotated corpus
- classifier ensemble
- text classification
- domain adaptation
- collaborative filtering
- machine learning algorithms
- multi-task
- transfer knowledge
- text categorization
- supervised learning
- pairwise
- question answering
- target domain
- data sets
- data points
- hidden Markov models
- computer vision