OPT: Open Pre-trained Transformer Language Models.
Susan Zhang, Stephen Roller, Naman Goyal, Mikel Artetxe, Moya Chen, Shuohui Chen, Christopher Dewan, Mona T. Diab, Xian Li, Xi Victoria Lin, Todor Mihaylov, Myle Ott, Sam Shleifer, Kurt Shuster, Daniel Simig, Punit Singh Koura, Anjali Sridhar, Tianlu Wang, Luke Zettlemoyer
Published in: CoRR (2022)
Keyphrases
- neural network
- language model
- pre-trained
- control signals
- language modeling
- n-gram
- training data
- speech recognition
- document retrieval
- information retrieval
- retrieval model
- test collection
- training examples
- probabilistic model
- smoothing methods
- statistical language models
- query expansion
- language models for information retrieval
- multi-modal
- training set