Improved Categorical Cross-Entropy Loss for Training Deep Neural Networks with Noisy Labels.
Panle Li, Xiaohui He, Dingjun Song, Zihao Ding, Mengjia Qiao, Xijie Cheng, Runchuan Li
Published in: PRCV (4) (2021)
Keyphrases
- cross entropy
- neural network
- training process
- training set
- training examples
- training algorithm
- feedforward neural networks
- maximum likelihood
- error function
- log likelihood
- artificial neural networks
- training data
- supervised learning
- recurrent neural networks
- multi layer perceptron
- fuzzy logic
- evaluation metrics
- language modeling
- neural nets
- training phase
- neural network model
- feature space
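For context on the loss function named in the title, the following minimal NumPy sketch computes the standard categorical cross-entropy between one-hot labels and predicted class probabilities. It is an illustration of the baseline loss only; the paper's improved variant for noisy labels is not described in this record and is not reproduced here. The function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    """Standard categorical cross-entropy (baseline, not the paper's variant).

    y_true: one-hot labels, shape (batch, num_classes)
    y_pred: predicted class probabilities, shape (batch, num_classes)
    """
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    # Per-sample loss is -log of the probability assigned to the true class;
    # the batch loss is the mean over samples.
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# Example: two samples, three classes
y_true = np.array([[1, 0, 0], [0, 1, 0]], dtype=float)
y_pred = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print(categorical_cross_entropy(y_true, y_pred))  # ~0.29
```

This baseline is equivalent to the negative log-likelihood of the true class under the predicted distribution, which connects the "maximum likelihood" and "log likelihood" keyphrases above to the loss in the title.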