Parallelizing Linear Recurrent Neural Nets Over Sequence Length
Eric Martin
Chris Cundy
Published in: CoRR (2017)
Keyphrases
neural nets
feed forward
back propagation
neural network
multi layer
fixed length
backpropagation neural networks
artificial neural networks
learning tasks
recurrent neural networks
machine learning
single layer
real world
information retrieval
longest common subsequence
shift register