Shrink-Perturb Improves Architecture Mixing during Population Based Training for Neural Architecture Search.
Alexander Chebykin, Arkadiy Dushatskiy, Tanja Alderliesten, Peter A. N. Bosman. Published in: CoRR (2023)
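The title refers to the shrink-perturb trick: scaling inherited network weights toward zero and adding a small amount of noise before training resumes. As a rough orientation only, below is a minimal sketch of that generic operation; the function name `shrink_perturb` and the coefficient values are illustrative assumptions, not the paper's exact procedure or hyperparameters.

```python
import numpy as np

def shrink_perturb(weights, shrink=0.4, perturb=0.1, rng=None):
    """Generic shrink-perturb: scale each weight tensor toward zero and add
    small Gaussian noise. Coefficients are placeholders, not the paper's values."""
    rng = np.random.default_rng() if rng is None else rng
    return [shrink * w + perturb * rng.standard_normal(w.shape) for w in weights]

# Example: apply the operation to copied weights before resuming training.
layers = [np.ones((4, 3)), np.zeros(3)]
mixed = shrink_perturb(layers)
```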
Keyphrases
- neural architecture
- multi-layer perceptron
- neural network
- feed-forward
- search space
- activation function
- training set
- basis functions
- artificial intelligence
- artificial neural networks
- search algorithm
- evolutionary search
- multilayer perceptron
- processing units
- simulated annealing
- evolutionary algorithm
- differential evolution
- hidden layer
- decision trees