UoR at SemEval-2021 Task 4: Using Pre-trained BERT Token Embeddings for Question Answering of Abstract Meaning.
Thanet Markchom, Huizhi Liang. Published in: SemEval@ACL/IJCNLP (2021)
Keyphrases
- question answering
- pre-trained
- natural language
- training data
- information retrieval
- question classification
- passage retrieval
- training examples
- control signals
- natural language processing
- information extraction
- cross language
- natural language questions
- syntactic information
- low dimensional
- word sense disambiguation
- multimedia
- answer validation
- QA systems
- automatically generated
- vector space
- high dimensional data
- dimensionality reduction
- semantic roles
- candidate answers
- language model
- supervised learning
- knowledge representation