Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation

Mozhdeh Gheini, Xiang Ren, Jonathan May
Published in: EMNLP (1) (2021)