How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers.
Michael Hassid
Hao Peng
Daniel Rotem
Jungo Kasai
Ivan Montero
Noah A. Smith
Roy Schwartz
Published in: CoRR (2022)
Keyphrases
visual attention
real time
data sets
artificial intelligence
search engine
bayesian networks
computational complexity
hidden markov models