Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction.
Masahiro Kaneko, Masato Mita, Shun Kiyono, Jun Suzuki, Kentaro Inui
Published in: CoRR (2020)
Keyphrases
- error correction
- language model
- error control
- probabilistic model
- noisy channel
- turbo codes
- language modelling
- statistical language models
- language modeling
- error detection
- n-gram
- speech recognition
- low complexity
- smoothing methods
- ldpc codes
- information retrieval
- pre-trained
- reed solomon
- relevance model
- retrieval model
- query expansion
- rate distortion
- statistical language modeling
- hidden markov models