Recurrent attention unit: A new gated recurrent unit for long-term memory of important parts in sequential data.

Zhaoyang Niu, Guoqiang Zhong, Guohua Yue, Li-Na Wang, Hui Yu, Xiao Ling, Junyu Dong
Published in: Neurocomputing (2023)
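For context, the building block the title refers to can be sketched as a standard gated recurrent unit (GRU) step. This is the conventional GRU formulation, not the paper's recurrent attention unit; all function and parameter names below are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One step of a standard GRU cell (illustrative, not the paper's RAU)."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h_prev)                # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)                # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))    # candidate state
    return (1.0 - z) * h_prev + z * h_tilde          # interpolated new state

# Run a few steps on random inputs to show the shapes involved.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = [rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)),
          rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)),
          rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h))]
h = np.zeros(d_h)
for _ in range(5):
    h = gru_step(rng.standard_normal(d_in), h, params)
```

Because the new state is a convex combination of the previous state and a tanh-bounded candidate, the hidden state stays in (-1, 1) when initialized at zero; the paper's contribution, per the title, is an attention-based modification of this gating for long-term memory.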
Keyphrases
  • feature space
  • sequential data
  • long-term memory
  • high-dimensional
  • hidden Markov models
  • fixed length
  • genetic algorithm
  • short-term
  • short-term memory