Surrogate-Assisted Evolutionary Neural Architecture Search with Isomorphic Training and Prediction
Conference proceeding   Peer reviewed

Pengcheng Jiang, Yu Xue, Ferrante Neri and Mohamed Wahib
Advanced Intelligent Computing Technology and Applications, Part II (ICIC 2024), Lecture Notes in Computer Science, Vol. 14863, pp. 191-203
01/01/2024

Abstract

Subjects: Computer Science; Computer Science, Artificial Intelligence; Computer Science, Interdisciplinary Applications; Computer Science, Theory & Methods; Science & Technology; Technology
Neural architecture search (NAS) is an increasingly popular method for the automatic design of neural networks. Although promising, NAS often incurs a significant computational cost. Surrogate models, which predict the performance of candidate networks without training them, are therefore used to speed up NAS. Since surrogate models must themselves be trained, their performance depends on the dataset of labelled candidate architectures, and generating these samples is time-consuming because each one requires training an architecture. The present paper proposes an inexpensive way of generating training data for the surrogate model. Specifically, the proposed algorithm exploits isomorphism to obtain additional training data for the graph-based encoding. We propose an isomorphic training scheme that combines the Mean Squared Error (MSE) with a novel isomorphic loss function, and an isomorphic score to predict the performance of candidate architectures. The proposed isomorphism-based surrogate is integrated within an evolutionary framework for NAS. Numerical experiments are performed on the NAS-Bench-101 and NAS-Bench-201 search spaces. The experimental results demonstrate that the proposed Isomorphic Training and Prediction Evolutionary Neural Architecture Search (ITP-ENAS) algorithm can identify architectures with better performance than other state-of-the-art algorithms, despite training only 424 architectures.
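The core idea of isomorphism-based augmentation can be illustrated on a NAS-Bench-101-style cell encoding (an adjacency matrix plus a list of node operations). The sketch below is illustrative, not the authors' implementation: it generates encodings isomorphic to a given cell by relabelling the intermediate nodes, so each variant describes the same network and can inherit the parent architecture's accuracy label at no extra training cost. Note that in practice a benchmark such as NAS-Bench-101 constrains valid encodings (e.g. upper-triangular adjacency), so a real pipeline would filter the permuted variants accordingly.

```python
import itertools
import numpy as np

def isomorphic_variants(adj, ops):
    """Yield encodings isomorphic to (adj, ops) by permuting the
    intermediate nodes of the cell DAG; the input node 0 and the
    output node n-1 stay fixed. Each variant represents the same
    network, so it can share the parent's accuracy label."""
    n = len(ops)
    variants = set()
    for perm in itertools.permutations(range(1, n - 1)):
        p = [0, *perm, n - 1]              # full node relabelling
        P = np.eye(n, dtype=int)[p]        # permutation matrix
        new_adj = P @ adj @ P.T            # new_adj[i, j] = adj[p[i], p[j]]
        new_ops = tuple(ops[i] for i in p)
        variants.add((new_adj.tobytes(), new_ops))
    return variants

# Hypothetical 4-node cell: input -> {conv3, conv1} -> output
adj = np.array([[0, 1, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 0]])
ops = ('input', 'conv3x3', 'conv1x1', 'output')
variants = isomorphic_variants(adj, ops)
```

Deduplicating via a set matters here: permutations of structurally identical subgraphs can collapse to the same encoding, and only genuinely distinct encodings add information to the surrogate's training set.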

