Making Pre-trained Language Models End-to-end Few-shot Learners with Contrastive Prompt Tuning.
Ziyun Xu, Chengyu Wang, Minghui Qiu, Fuli Luo, Runxin Xu, Songfang Huang, Jun Huang
Published in: WSDM (2023)
Keyphrases
- end-to-end
- language model
- pre-trained
- language modeling
- speech recognition
- document retrieval
- n-gram
- probabilistic model
- retrieval model
- training data
- language modelling
- information retrieval
- query expansion
- test collection
- congestion control
- language models for information retrieval
- smoothing methods
- learning environment
- learning process
- control signals
- e-learning
- statistical language models
- training examples
- video shots
- video sequences
- relevance model
- computer vision
- multi-modal
- small number
- image sequences
- multimedia