Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension

An Yang, Quan Wang, Jing Liu, Kai Liu, Yajuan Lyu, Hua Wu, Qiaoqiao She, Sujian Li
Published in: ACL (1) (2019)
Keyphrases
  • pre-trained
  • reading comprehension
  • neural network
  • e-learning
  • prior knowledge
  • knowledge level
  • machine learning
  • language learning
  • vocabulary acquisition