Compressing Recurrent Neural Networks Using Hierarchical Tucker Tensor Decomposition
Miao Yin, Siyu Liao, Xiao-Yang Liu, Xiaodong Wang, Bo Yuan. Published in: CoRR (2020)
Keyphrases
- recurrent neural networks
- tensor decomposition
- data representation
- auxiliary information
- high order
- neural network
- tensor factorization
- feed forward
- echo state networks
- low rank
- artificial neural networks
- recurrent networks
- visual data
- higher order
- matrix factorization
- semi supervised
- domain knowledge
- xml documents
- machine learning
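The keyphrases above (tensor decomposition, low rank, matrix factorization) all revolve around the same idea: replacing a large dense weight matrix with a product of small factors. The paper itself uses Hierarchical Tucker decomposition; the sketch below illustrates the simpler rank-truncated SVD case, with arbitrary example shapes and rank chosen only for illustration, not taken from the paper.

```python
import numpy as np

# Illustrative sketch only: low-rank (truncated SVD) compression of one
# weight matrix, a simpler relative of the Hierarchical Tucker
# decomposition applied to RNN weights in the paper. Shapes and rank
# are assumptions for demonstration.
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 512))    # dense weight matrix

r = 16                                 # chosen rank (assumption)
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]                   # 256 x r factor
B = Vt[:r, :]                          # r x 512 factor

dense_params = W.size                  # 256 * 512 = 131072
lowrank_params = A.size + B.size       # 256*16 + 16*512 = 12288
compression = dense_params / lowrank_params

x = rng.standard_normal(512)
y_dense = W @ x                        # original linear map
y_lowrank = A @ (B @ x)                # approximate map, ~10.7x fewer parameters
print(f"compression ratio: {compression:.1f}x")
```

Hierarchical Tucker generalizes this by first reshaping the weight matrix into a high-order tensor and then factorizing it along a binary tree of modes, which typically yields far higher compression ratios than a single matrix factorization at comparable accuracy.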