XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference
Francesco Conti, Pasquale Davide Schiavone, Luca Benini
Published in: IEEE Trans. Comput. Aided Des. Integr. Circuits Syst. (2018)
Keyphrases
- neural network
- network architecture
- neural model
- neural fuzzy
- field programmable gate array
- learning rules
- hardware and software
- low cost
- distributed representations
- real time
- back propagation
- inference engine
- neural computation
- neural architecture
- image processing
- bayesian networks
- artificial neural networks
- neural learning
- neural network model
- feed forward
- neuron model
- probabilistic inference
- embedded systems
- genetic algorithm
- fuzzy systems
- associative memory
- self organizing maps
- bio inspired
- computer systems
- massively parallel
- fuzzy logic
- neural network is trained
- control unit
- inference process
- learning algorithm
- activation function
- hardware implementation
- fault diagnosis
- fuzzy inference system
- biologically plausible
- fuzzy neural network
- expert systems
- hebbian learning
- bp neural network
- prediction model
- belief networks
- recurrent neural networks