Structured in Space, Randomized in Time: Leveraging Dropout in RNNs for Efficient Training
Anup Sarma, Sonali Singh, Huaipan Jiang, Rui Zhang, Mahmut T. Kandemir, Chita R. Das
Published in: NeurIPS (2021)
Keyphrases
- recurrent neural networks
- feedforward neural networks
- structured data
- real-time
- search space
- lightweight
- database
- computationally efficient
- recurrent networks
- training algorithm
- computationally expensive
- online learning
- artificial neural networks
- search algorithm
- training data
- decision trees
- e-learning
- information systems