XNOR Neural Engine: A Hardware Accelerator IP for 21.6 fJ/op Binary Neural Network Inference
Francesco Conti, Pasquale Davide Schiavone, Luca Benini. Published in: CoRR (2018)
Keyphrases
- neural network
- network architecture
- inference engine
- neural model
- neural fuzzy
- low cost
- field programmable gate array
- hardware and software
- artificial neural networks
- associative memory
- neuron model
- real time
- neural architecture
- parallel implementation
- neural computation
- learning rules
- activation function
- genetic algorithm
- neural learning
- back propagation
- probabilistic inference
- bp neural network
- fuzzy neural network
- fault diagnosis
- multi layer perceptron
- neural network model
- fuzzy logic
- computer systems
- radial basis function
- bayesian networks
- embedded systems
- recurrent networks
- inference process
- feed forward neural networks
- multilayer perceptron
- self organizing maps
- computing systems
- fuzzy systems
- hardware architecture
- hebbian learning
- feed forward