Throughput optimizations for FPGA-based deep neural network inference.
Thorbjörn Posewsky
Daniel Ziener
Published in: Microprocessors and Microsystems (2018)
Keyphrases
neural network
artificial neural networks
response time
back propagation
neural network model
training algorithm
genetic algorithm
inference process
bayesian inference
bayesian networks
hardware implementation
network model
pattern recognition
multilayer perceptron
application specific
hardware architecture
inference engine
image reconstruction from projections
feed forward
fermentation process
belief nets
neural network is trained
channel capacity
fuzzy artmap
feed forward neural networks
fuzzy neural network
self organizing maps
probabilistic inference
bp neural network
prediction model
belief networks
efficient implementation