Hardware Implementation of Hyperbolic Tangent Activation Function for Floating Point Formats
T. K. R. Arvind, Marcel Brand, Christian Heidorn, Srinivas Boppu, Frank Hannig, Jürgen Teich
Published in: VDAT (2020)
Keyphrases
- hardware implementation
- activation function
- floating point
- floating point arithmetic
- neural network
- signal processing
- efficient implementation
- fixed point
- feed forward
- field programmable gate array
- feedforward neural networks
- artificial neural networks
- neural nets
- hidden layer
- multilayer perceptron
- learning rate
- instruction set
- recurrent neural networks
- basis functions
- network architecture
- partial differential equations
- radial basis function
- linear combination
- memory management
- reinforcement learning
- feature extraction