Abstract
Evolutionary neural architecture search (ENAS) treats neural network design as an optimisation problem and addresses it via evolutionary computation. Although flexible and well suited to automated design, ENAS typically suffers from high computational costs because a network must be trained at each fitness evaluation. Surrogate-assisted ENAS methods mitigate this cost by replacing the computationally expensive fitness function with a computationally cheap approximate one for some of the fitness evaluations in a run. A major open challenge in the field is the smooth integration of such surrogate models (and, often, of their data collection mechanisms) within ENAS frameworks. This paper puts forth a simple yet effective way to address this challenge. During the initial stage of the optimisation, the proposed algorithm, score predictor-assisted ENAS (SPNAS), evolves a small population of candidate architectures using the ground-truth fitness, i.e., the test error rate of each network after training. The data collected in this stage are then used to train a multi-layer perceptron that provides an alternative fitness function. Unlike the surrogates of previous studies, this alternative fitness does not approximate the error rate itself but is designed to preserve its order relation over populations of candidate architectures, which naturally allows for computationally cheap population ranking. Most of the evolution is then carried out with the surrogate (i.e., alternative) fitness on a large population, without retraining the surrogate model or computing the ground-truth fitness. Experiments on the EvoXBench platform show that, across its seven search spaces, SPNAS achieves excellent error rates despite a modest number of ground-truth fitness calls.
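The central mechanism, an MLP trained to preserve the ordering of error rates rather than to approximate their values, can be illustrated with a minimal sketch. The sketch below is an assumption-laden illustration, not the paper's implementation: it uses a pairwise margin ranking loss and hypothetical names (ScorePredictor, train_rank_surrogate), and the actual training objective and architecture encodings in SPNAS may differ.

```python
# Hypothetical sketch of a rank-preserving surrogate for SPNAS-style search.
# All names and the pairwise ranking objective are illustrative assumptions.
import torch
import torch.nn as nn


class ScorePredictor(nn.Module):
    """MLP mapping an architecture encoding to a scalar score whose
    ordering is trained to agree with the ordering of test error rates."""

    def __init__(self, enc_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(enc_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)


def train_rank_surrogate(model, encodings, errors, epochs=200, lr=1e-3):
    """Fit the predictor with a pairwise hinge (margin ranking) loss so
    that architectures with higher ground-truth error get higher scores."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MarginRankingLoss(margin=0.1)
    n = encodings.shape[0]
    for _ in range(epochs):
        # Sample random pairs from the ground-truth stage's archive.
        i, j = torch.randint(n, (256,)), torch.randint(n, (256,))
        s_i, s_j = model(encodings[i]), model(encodings[j])
        # target = +1 where architecture i has higher error than j.
        target = (errors[i] > errors[j]).float() * 2 - 1
        loss = loss_fn(s_i, s_j, target)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```

Once fitted on the data gathered during the ground-truth stage, such a predictor can rank an arbitrarily large population with a single forward pass, e.g. `order = torch.argsort(model(pop_encodings))`, with no further network training and no additional ground-truth fitness calls.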