Forgetting is Necessary for Recurrent Networks to Recover Sequences Longer than Network Size.
R. Koray Çiftçi
Rafet Akdeniz
Published in: TSP (2020)
Keyphrases
network size
recurrent networks
recurrent neural networks
feed forward
activation function
neural network
biologically inspired
communication cost
network structure
network parameters
back propagation
artificial neural networks
training data
social networks
small number
sensor networks
data structure
data sets