GHN-QAT: Training Graph Hypernetworks to Predict Quantization-Robust Parameters of Unseen Limited Precision Neural Networks.
Stone Yun, Alexander Wong. Published in: CoRR (2023)
Keyphrases
- neural network
- training process
- parameter tuning
- training set
- training algorithm
- training examples
- multi-layer perceptron
- multi-layer
- radial basis function network
- pattern recognition
- artificial neural networks
- random walk
- feed forward neural networks
- high precision
- backpropagation algorithm
- feedforward neural networks
- neural network training
- structured data
- directed graph
- quantization error
- supervised learning
- training samples
- graph structure
- computational complexity
- maximum likelihood
- graph databases
- data sets
- error backpropagation
- graph representation
- recurrent networks
- activation function
- parameter settings
- parameter values
- feed forward
- neural network model
- fault diagnosis
- backpropagation
- decision trees
- learning algorithm