MathBERT: A Pre-trained Language Model for General NLP Tasks in Mathematics Education.
Jia Tracy Shen, Michiharu Yamashita, Ethan Prihar, Neil Heffernan, Xintao Wu, Dongwon Lee
Published in: CoRR (2021)
Keyphrases
- language model
- language modeling
- pre-trained
- n-gram
- probabilistic model
- document retrieval
- mathematics education
- retrieval model
- query expansion
- ad hoc information retrieval
- speech recognition
- natural language
- test collection
- information retrieval
- neural network
- smoothing methods
- natural language processing
- context-sensitive
- information extraction
- question answering
- translation model
- mixture model
- wordnet
- training data
- machine learning