The Hidden Power of Pure 16-bit Floating-Point Neural Networks.
Juyoung Yun
Byungkon Kang
Zhoulai Fu
Published in: CoRR (2023)
Keyphrases
floating point
neural network
fixed point
square root
instruction set
power consumption
floating point arithmetic
sparse matrices
back propagation
floating point unit
artificial neural networks
hidden information
image processing
general purpose
interval arithmetic