Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners.
Ningyu Zhang, Luoqiu Li, Xiang Chen, Shumin Deng, Zhen Bi, Chuanqi Tan, Fei Huang, Huajun Chen
Published in: ICLR (2022)
Keyphrases
- language model
- pre-trained
- language modeling
- n-gram
- probabilistic model
- information retrieval
- document retrieval
- speech recognition
- language modelling
- learning environment
- query expansion
- training data
- statistical language models
- retrieval model
- smoothing methods
- document ranking
- test collection
- video sequences
- learning process
- relevance model
- language models for information retrieval
- e-learning
- video content
- image sequences
- training examples
- video data
- machine learning
- text mining
- statistical language modeling
- high-dimensional