An RRAM-Based Computing-in-Memory Architecture and Its Application in Accelerating Transformer Inference.
Zhaojun Lu, Xueyan Wang, Md Tanvir Arafin, Haoxiang Yang, Zhenglin Liu, Jiliang Zhang, Gang Qu
Published in: IEEE Trans. Very Large Scale Integr. Syst. (2024)
Keyphrases
- probability distribution
- Bayesian networks
- belief networks
- inference engine
- management system
- memory requirements
- real time
- probabilistic inference
- memory hierarchy
- inference process
- fuzzy logic
- fault diagnosis
- memory usage
- memory subsystem
- architectural design
- memory space
- data flow
- Bayesian inference
- associative memory
- hidden Markov models
- expert systems
- knowledge base
- artificial intelligence