Neural Networks Trained by Weight Permutation are Universal Approximators
Yongqiang Cai, Gaohang Chen, Zhonghua Qiao
Published in: CoRR (2024)
Keyphrases
- neural network
- multilayer perceptron
- trained neural networks
- multi layer perceptron
- training process
- synaptic weights
- trained neural network
- back propagation
- auto associative
- backpropagation algorithm
- weight update
- artificial neural networks
- neural network model
- pattern recognition
- radial basis function
- activation function
- network architecture
- hidden layer
- fuzzy logic
- svm classifier
- multilayer neural network
- elman network
- rule extraction
- feed forward
- self organizing maps
- neural learning
- hidden units
- genetic algorithm
- neural network is trained
- number of hidden layers
- adaptive resonance theory
- back propagation neural network
- function approximators
- feedforward neural networks
- fuzzy neural network
- multi layer
- neural nets
- fault diagnosis
- support vector machine
- evolutionary algorithm