Training deep neural-networks using a noise adaptation layer.
Jacob Goldberger, Ehud Ben-Reuven. Published in: ICLR (Poster), 2017.
Keyphrases
- neural network
- multi layer
- training process
- training algorithm
- error back propagation
- feed forward neural networks
- train a neural network
- multi layer perceptron
- random noise
- backpropagation algorithm
- artificial neural networks
- deep architectures
- noise level
- additive noise
- neural network training
- feed forward
- online learning
- signal to noise ratio
- multiple layers
- restricted boltzmann machine
- noise model
- genetic algorithm
- radial basis function network
- image noise
- noisy environments
- single layer
- covariate shift
- data sets
- noise reduction
- back propagation
- test set
- missing data
- noisy data
- pattern recognition
- supervised learning
- fuzzy logic
- training set
- fuzzy systems
- trained neural network
- deep belief networks
- neural network structure
- gaussian noise
- activation function
- training and testing data
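The technique named in the title adds a layer on top of the base classifier's softmax that models the transition from true labels to noisy observed labels, so the network can be trained directly on noisy data. A minimal NumPy sketch of that forward computation, assuming a row-stochastic transition matrix parameterized by per-row softmax logits (function names and the toy values are illustrative, not from the paper):

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def noisy_label_probs(clean_probs, noise_logits):
    # T[i, j] = p(observed noisy label j | true label i);
    # one softmax per row keeps each row a valid distribution
    T = softmax(noise_logits, axis=1)
    return clean_probs @ T

# toy example: 3 classes, base-classifier softmax output for 2 inputs
clean = softmax(np.array([[2.0, 0.5, 0.1],
                          [0.2, 1.5, 0.3]]))
# identity-biased noise logits (mostly-correct labels; hypothetical values)
noise_logits = np.log(np.array([[0.8, 0.1, 0.1],
                                [0.1, 0.8, 0.1],
                                [0.1, 0.1, 0.8]]))
noisy = noisy_label_probs(clean, noise_logits)
```

In the paper the noise layer's parameters are learned jointly with the network by backpropagation; here the composition is shown with fixed logits only. Mixing through `T` flattens the predicted distribution, which is exactly the effect label noise has on observed data.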