QUEST: Multi-Purpose Log-Quantized DNN Inference Engine Stacked on 96-MB 3-D SRAM Using Inductive Coupling Technology in 40-nm CMOS
Kodai Ueyoshi, Kota Ando, Kazutoshi Hirose, Shinya Takamaeda-Yamazaki, Mototsugu Hamada, Tadahiro Kuroda, Masato Motomura

Published in: IEEE J. Solid-State Circuits (2019)
Keyphrases
- inference engine
- cmos technology
- nm technology
- power consumption
- low power
- silicon on insulator
- expert systems
- knowledge representation
- knowledge base
- rule base
- metal oxide semiconductor
- knowledge based systems
- low voltage
- knowledge representation language
- backward chaining
- blackboard architecture
- dynamic random access memory
- low cost
- machine learning
- random access memory
- artificial intelligence
- data mining
- power management
- image sensor
- fuzzy logic
- neural network