Abstract
• A novel heterogeneous search space for NAS with richer primitive operations (e.g., feature self-calibration).
• A novel Neural Operator Search (NOS) method dedicated to NAS in the proposed heterogeneous search space.
• Our approach is highly competitive on both the CIFAR and ImageNet-mobile image classification benchmarks.
Existing neural architecture search (NAS) methods usually explore a limited feature-transformation-only search space, ignoring other advanced feature operations such as feature self-calibration by attention and dynamic convolutions. This prevents NAS algorithms from discovering more advanced network architectures. We address this limitation by additionally exploiting feature self-calibration operations, resulting in a heterogeneous search space. To tackle the challenges of operation heterogeneity and a significantly larger search space, we formulate a neural operator search (NOS) method. NOS introduces a novel heterogeneous residual block that integrates the heterogeneous operations in a unified structure, and an attention-guided search strategy that facilitates the search over this vast space. Extensive experiments show that NOS can discover novel cell architectures with highly competitive performance on the CIFAR and ImageNet benchmarks.
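To make the idea of a heterogeneous residual block concrete, the sketch below shows, under assumed PyTorch conventions, how a feature-transformation operation (a convolution) and a feature self-calibration operation (a channel-attention gate) could be mixed inside one residual unit with softmax-normalized architecture weights. The class names (`HeteroResidualBlock`, `ChannelSelfCalibration`) and the specific mixing scheme are illustrative assumptions, not the paper's exact design.

```python
# Minimal, illustrative sketch (not the authors' implementation) of a
# "heterogeneous residual block" mixing a transformation op with a
# self-calibration (attention) op. Names and design details are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelSelfCalibration(nn.Module):
    """Squeeze-and-excitation-style gate: x -> x * sigmoid(g(pool(x)))."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        gate = F.adaptive_avg_pool2d(x, 1).flatten(1)          # global context per channel
        gate = torch.sigmoid(self.fc(gate)).view(x.size(0), -1, 1, 1)
        return x * gate                                        # self-calibrated features


class HeteroResidualBlock(nn.Module):
    """Residual block whose candidate ops are heterogeneous:
    a transformation op (conv) and a self-calibration op (attention),
    weighted by searchable, softmax-normalized parameters."""

    def __init__(self, channels):
        super().__init__()
        self.transform = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.calibrate = ChannelSelfCalibration(channels)
        self.alpha = nn.Parameter(torch.zeros(2))              # searchable operator weights

    def forward(self, x):
        w = torch.softmax(self.alpha, dim=0)
        out = w[0] * self.transform(x) + w[1] * self.calibrate(x)
        return x + out                                         # residual connection


if __name__ == "__main__":
    block = HeteroResidualBlock(channels=16)
    y = block(torch.randn(2, 16, 32, 32))
    print(y.shape)  # torch.Size([2, 16, 32, 32])
```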