
How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers

Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, Roy Schwartz
Published in: CoRR (2022)
Keyphrases
  • visual attention
  • real time
  • data sets
  • artificial intelligence
  • search engine
  • bayesian networks
  • computational complexity
  • hidden markov models