A 28nm 53.8TOPS/W 8b Sparse Transformer Accelerator with In-Memory Butterfly Zero Skipper for Unstructured-Pruned NN and CIM-Based Local-Attention-Reusable Engine.
Shiwei Liu, Peizhe Li, Jinshan Zhang, Yunzhengmao Wang, Haozhe Zhu, Wenning Jiang, Shan Tang, Chixiao Chen, Qi Liu, Ming Liu
Published in: ISSCC (2023)
Keyphrases
- nearest neighbor
- neural network
- compute intensive
- fuzzy logic
- high dimensional
- sparse representation
- knn
- short term memory
- neural network model
- memory requirements
- focus of attention
- sparse data
- parallel implementation
- memory usage
- artificial neural networks
- artificial intelligence
- compressive sensing
- semi structured
- partial discharge
- computer integrated manufacturing
- bp neural network
- feed forward
- fault diagnosis
- software systems
- power system
- back propagation
- genetic algorithm