Abstract
Neural architecture search (NAS) automates the design of neural networks, but the cost of evaluating candidate architectures is high. Surrogate-assisted NAS methods replace full training with inexpensive predictive models, but must balance the cost of training the surrogate against its predictive accuracy. In this paper, we propose a progressive neural predictor with score-based sampling (PNSS) that improves the surrogate model under a limited budget of training data. Unlike existing algorithms that rely on a fixed initial sample selection, PNSS progressively selects new training samples for the surrogate model online, exploiting information accumulated during the preceding search. In each iteration, the sampling scores are dynamically adjusted according to the current prediction rankings so that promising architectures are tracked, which gradually refines the surrogate model. In this way, training the predictor and searching for architectures are coupled, improving sample efficiency. In addition, the surrogate model at each stage of training is assigned a prediction confidence that reflects its accuracy at that stage. Experiments are conducted on the NAS-Bench-101 and NAS-Bench-201 benchmarks. The results show that the proposed PNSS algorithm outperforms existing methods when training samples are limited. Visualisation of the search process and an ablation study further demonstrate the effectiveness of the progressive search.
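The progressive sampling loop summarised above can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the toy benchmark (`toy_accuracy`), the k-nearest-neighbour surrogate (`knn_predict`), the encoding length, and the evaluation budget are all assumptions, and the paper's score adjustment and confidence weighting are reduced here to a plain greedy re-selection of the predicted-best candidate in each round.

```python
# Hedged sketch of a progressive surrogate-assisted search loop.
# All names and settings are illustrative placeholders, not from the paper.
import random

random.seed(0)

DIM = 8  # length of a toy binary architecture encoding


def toy_accuracy(arch):
    # Synthetic stand-in for fully training an architecture: accuracy
    # grows with the number of "good" operations, plus mild noise.
    return sum(arch) / DIM + random.gauss(0, 0.02)


def knn_predict(arch, archive, k=3):
    # Minimal surrogate: k-nearest-neighbour regression on Hamming distance
    # over the architectures evaluated so far.
    nearest = sorted(
        archive,
        key=lambda pair: sum(a != b for a, b in zip(arch, pair[0])),
    )[:k]
    return sum(acc for _, acc in nearest) / len(nearest)


# Candidate pool and a small set of initially evaluated architectures.
pool = [tuple(random.randint(0, 1) for _ in range(DIM)) for _ in range(200)]
archive = [(a, toy_accuracy(a)) for a in random.sample(pool, 10)]
budget = 30  # total number of true evaluations allowed

while len(archive) < budget:
    evaluated = {a for a, _ in archive}
    candidates = [a for a in pool if a not in evaluated]
    # Progressive sampling: re-rank the remaining candidates with the
    # current surrogate and spend one true evaluation on the predicted best,
    # which in turn refines the surrogate for the next round.
    best = max(candidates, key=lambda a: knn_predict(a, archive))
    archive.append((best, toy_accuracy(best)))

best_arch, best_acc = max(archive, key=lambda pair: pair[1])
print(best_arch, round(best_acc, 3))
```

The key design point this illustrates is the coupling of predictor training and architecture search: each round's true evaluation is chosen by the surrogate and immediately fed back into it, rather than fixing the surrogate's training set in advance.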