
Attention Alignment and Flexible Positional Embeddings Improve Transformer Length Extrapolation.

Ta-Chung Chi, Ting-Han Fan, Alexander I. Rudnicky
Published in: CoRR (2023)