An Energy-Efficient Near-Memory Computing Architecture for CNN Inference at Cache Level.
Masoud Nouripayam, Arturo Prieto, Vignajeth Kuttuva Kishorelal, Joachim Rodrigues
Published in: ICECS (2021)
Keyphrases
- memory hierarchy
- memory access
- main memory
- memory subsystem
- memory management
- multithreading
- application level
- computing power
- convolutional neural networks
- inference engine
- real time
- energy efficient
- associative memory
- garbage collection
- bayesian networks
- cache misses
- cache conscious
- random access
- memory requirements
- data access
- wireless sensor networks
- data streams
- neural network