Few-bit Backward: Quantized Gradients of Activation Functions for Memory Footprint Reduction.
Georgii Sergeevich Novikov, Daniel Bershatsky, Julia Gusak, Alex Shonenkov, Denis Valerievich Dimitrov, Ivan V. Oseledets
Published in: ICML (2023)
Keyphrases
- memory footprint
- activation function
- neural network
- neural architecture
- memory usage
- feed forward
- artificial neural networks
- back propagation
- hidden layer
- learning rate
- neural nets
- network architecture
- hidden nodes
- multilayer perceptron
- fuzzy neural network
- training phase
- significant bit
- parametric models
- input space
- radial basis function
- signal processing
- dimensionality reduction
- fuzzy logic
- k-NN
- dynamic programming
- artificial intelligence
- genetic algorithm
- machine learning
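The entry carries no abstract, but the title names a concrete mechanism: quantize the gradient of the activation function so that the backward pass can be reconstructed from a few-bit code per element instead of the full-precision saved activation. Below is a minimal, hypothetical PyTorch sketch of that general idea for GELU. The names (`FewBitGELU`), the bucket count `NUM_LEVELS`, the uniform boundaries, and the midpoint-based surrogate levels are all illustrative assumptions, not the paper's actual implementation, which may choose boundaries and levels differently.

```python
# Hypothetical sketch: replace the exact activation derivative with a few-bit
# piecewise-constant surrogate, so backward needs only a small integer code
# per element instead of the full-precision input tensor.
import torch

NUM_LEVELS = 16  # assumed 4-bit backward: 16 quantization buckets

# Assumed uniform boundaries on a range where GELU'(x) actually varies;
# outside it the derivative is nearly 0 or 1, so clamping loses little.
BOUNDARIES = torch.linspace(-4.0, 4.0, NUM_LEVELS - 1)

def gelu_derivative(x: torch.Tensor) -> torch.Tensor:
    # Exact derivative of GELU(x) = x * Phi(x): GELU'(x) = Phi(x) + x * phi(x)
    phi = torch.exp(-0.5 * x**2) / (2 * torch.pi) ** 0.5
    Phi = 0.5 * (1.0 + torch.erf(x / 2**0.5))
    return Phi + x * phi

# One representative derivative value per bucket (here: bucket midpoints,
# an illustrative choice rather than an optimized one).
_mids = torch.cat([
    BOUNDARIES[:1] - 0.5,
    (BOUNDARIES[:-1] + BOUNDARIES[1:]) / 2,
    BOUNDARIES[-1:] + 0.5,
])
LEVELS = gelu_derivative(_mids)

class FewBitGELU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Save only a small integer code per element (fits in 4 bits),
        # not the fp32/fp16 activation input itself.
        code = torch.bucketize(x, BOUNDARIES).to(torch.uint8)
        ctx.save_for_backward(code)
        return torch.nn.functional.gelu(x)

    @staticmethod
    def backward(ctx, grad_output):
        (code,) = ctx.saved_tensors
        # Look up the piecewise-constant surrogate derivative by code.
        return grad_output * LEVELS.to(grad_output.device)[code.long()]

if __name__ == "__main__":
    x = torch.randn(8, requires_grad=True)
    FewBitGELU.apply(x).sum().backward()
    print(x.grad)  # approximate GELU gradient recovered from 4-bit codes
```

Under these assumptions, the memory saved for backward drops from one 16- or 32-bit value per activation element to one small code (here stored in a `uint8`; packing two 4-bit codes per byte would halve that again), which is where the memory-footprint reduction in the title would come from.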