Easy attention: A simple self-attention mechanism for Transformers.

Marcial Sanchis-Agudo, Yuning Wang, Karthik Duraisamy, Ricardo Vinuesa
Published in: CoRR (2023)
Keyphrases
  • attention mechanism
  • visual attention
  • saliency map
  • multiscale
  • high resolution
  • natural images
  • human perception