Sparse ReRAM Engine: Joint Exploration of Activation and Weight Sparsity in Compressed Neural Networks
Tzu-Hsien Yang, Hsiang-Yun Cheng, Chia-Lin Yang, I-Ching Tseng, Han-Wen Hu, Hung-Sheng Chang, Hsiang-Pang Li
Published in: ISCA (2019)
Keyphrases
- neural network
- sparse representation
- high dimensional
- sparsity constraints
- sparse approximation
- mixed norm
- joint optimization
- weight update
- sparse data
- back propagation
- compressed sensing
- compressive sampling
- fuzzy logic
- pattern recognition
- basis pursuit
- sparse coding
- artificial neural networks
- sparse approximations
- information processing
- dictionary learning
- regularized regression
- self organizing maps
- neural network model
- feed forward
- multi layer
- group lasso
- sparse reconstruction
- activation function
- recurrent neural networks
- neural nets
- random projections
- tensor factorization
- data compression
- multilayer perceptron
- orthogonal matching pursuit
- sparsity inducing
- denoising
- data structure
- training process
- compressive sensing
- convex optimization
- elastic net
- fault diagnosis
- signal processing
- learning algorithm
- genetic algorithm