Abstract
Neural architecture search (NAS) is an increasingly popular method for the automatic design of neural networks. Although promising, NAS typically incurs a significant computational cost. Surrogate models, which predict the performance of candidate networks without training them, are therefore used to accelerate the search. Since surrogate models must themselves be trained, their accuracy depends on the dataset of labelled candidate architectures, and generating these samples is time-consuming because each label requires training an architecture. The present paper proposes an inexpensive way of generating training data for the surrogate model. Specifically, the proposed algorithm exploits graph isomorphism to obtain additional training samples for the graph-based encoding. We propose an isomorphic training procedure that combines the standard Mean Squared Error (MSE) loss with a novel isomorphic loss function, and an isomorphic score to predict the performance of candidate architectures. The proposed isomorphism-based surrogate is integrated within an evolutionary framework for NAS. Numerical experiments are performed on the NAS-Bench-101 and NAS-Bench-201 search spaces. The experimental results demonstrate that the proposed Isomorphic Training and Prediction Evolutionary Neural Architecture Search (ITP-ENAS) algorithm can identify architectures with better performance than other state-of-the-art algorithms, despite training only 424 architectures.