Hybrid 8-bit Floating Point (HFP8) Training and Inference for Deep Neural Networks.
Xiao Sun, Jungwook Choi, Chia-Yu Chen, Naigang Wang, Swagath Venkataramani, Vijayalakshmi Srinivasan, Xiaodong Cui, Wei Zhang, Kailash Gopalakrishnan
Published in: NeurIPS (2019)
Keyphrases
- floating point
- neural network
- training process
- fixed point
- training algorithm
- feedforward neural networks
- square root
- back propagation
- bayesian networks
- structured prediction
- multi layer perceptron
- feed forward neural networks
- multilayer neural network
- deep architectures
- training set
- instruction set
- sparse matrices
- artificial neural networks
- higher order
- fast fourier transform
- floating point arithmetic
- hidden layer
- multilayer perceptron
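The paper's HFP8 scheme pairs two 8-bit floating-point formats, using a higher-precision-mantissa format (1 sign / 4 exponent / 3 mantissa bits) for the forward pass and a wider-exponent format (1/5/2) for backpropagated gradients. As a rough illustration of the minifloat rounding such formats involve, the sketch below quantizes a value into a configurable sign/exponent/mantissa budget with round-to-nearest and saturation; it is an illustrative simplification (no Inf/NaN encodings, no per-layer exponent-bias shifting), not the paper's implementation.

```python
import math

def quantize_minifloat(x: float, exp_bits: int = 4, man_bits: int = 3) -> float:
    """Round x to the nearest value representable with 1 sign bit,
    exp_bits exponent bits, and man_bits mantissa bits (saturating,
    no Inf/NaN encodings)."""
    if x == 0.0:
        return 0.0
    bias = 2 ** (exp_bits - 1) - 1
    e_max = 2 ** exp_bits - 2 - bias            # largest normal exponent
    e_min = 1 - bias                            # smallest normal exponent
    max_val = (2 - 2.0 ** (-man_bits)) * 2.0 ** e_max
    a = min(abs(x), max_val)                    # saturate on overflow
    e = max(math.floor(math.log2(a)), e_min)    # binade; subnormals share e_min
    step = 2.0 ** (e - man_bits)                # quantization step in this binade
    q = round(a / step) * step
    q = min(q, max_val)                         # rounding can overshoot the top binade
    return math.copysign(q, x)

# Forward-style 1/4/3 format: coarse steps, small dynamic range.
print(quantize_minifloat(0.3, exp_bits=4, man_bits=3))    # snaps to 0.3125
print(quantize_minifloat(1000.0, exp_bits=4, man_bits=3)) # saturates at 240.0
# Gradient-style 1/5/2 format: coarser mantissa, much wider range.
print(quantize_minifloat(1000.0, exp_bits=5, man_bits=2)) # 1024.0, in range
```

The two calls at the bottom show the trade-off the hybrid scheme exploits: the 1/4/3 format resolves values near 1 more finely, while the 1/5/2 format tolerates the large dynamic range of gradients.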