Efficient recurrent architectures through activity sparsity and sparse back-propagation through time.
Anand Subramoney, Khaleelulla Khan Nazeer, Mark Schöne, Christian Mayr, David Kappel
Published in: ICLR (2023)
Keyphrases
- back propagation
- feed forward
- artificial neural networks
- neural network
- high dimensional
- sparse representation
- bp neural network
- bp algorithm
- multilayer perceptron
- neural nets
- hidden layer
- training algorithm
- fuzzy logic
- bp network
- levenberg marquardt
- radial basis
- sparsity constraints
- error back propagation
- multilayer perceptron neural network
- recurrent neural networks
- learning algorithm
- feed forward neural networks
- least squares
- connection weights
- trained neural network
- expert systems
- genetic algorithm
- cascade correlation
- machine learning
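To give the title's central idea some concrete shape, below is a minimal, hypothetical sketch of activity sparsity in a recurrent layer: units update their internal state densely, but only emit an output (an "event") when the state crosses a threshold, so most activations per step are zero. This is a generic illustration under assumed names (`step`, `theta`), not the paper's specific architecture or training procedure; in particular, the sparse back-propagation through time described in the title is only hinted at by the comment that zero outputs carry no gradient paths.

```python
import numpy as np

# Hypothetical illustration, not the paper's exact cell: a recurrent layer whose
# units only communicate when their hidden state crosses a threshold, so most
# outputs per step are zero. During training, zero outputs contribute no
# gradient paths through time, which is the intuition behind sparse BPTT.

rng = np.random.default_rng(0)

def step(x, h, Wx, Wh, theta=0.5):
    """One recurrent step with thresholded ('event-based') outputs."""
    h = np.tanh(x @ Wx + h @ Wh)           # dense internal state update
    events = np.where(h > theta, h, 0.0)   # only above-threshold units emit
    h = h - events                          # emitting units are (soft-)reset
    return h, events

n_in, n_hidden, T = 8, 32, 20
Wx = rng.normal(scale=0.3, size=(n_in, n_hidden))
Wh = rng.normal(scale=0.3, size=(n_hidden, n_hidden))

h = np.zeros(n_hidden)
active = 0
for t in range(T):
    x = rng.normal(size=n_in)
    h, out = step(x, h, Wx, Wh, theta=0.5)
    active += np.count_nonzero(out)

print(f"fraction of active units per step: {active / (T * n_hidden):.2f}")
```

Running the sketch prints the average fraction of units that emit per time step, which is the quantity an activity-sparse design tries to keep small.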