Abstract
Designing effective neural architectures remains a central challenge in deep learning, and Neural Architecture Search (NAS) has become a popular tool for automating this process. However, many existing NAS approaches depend on hand-crafted architecture descriptors or shallow performance predictors, which fail to capture the structural complexity of candidate networks and often provide unreliable search guidance. We introduce the Graph Embedding Comparator with Isomorphic Multi-Comparison (GEC-IMC), an evolutionary NAS framework that learns architecture representations directly from graph structure. A graph convolutional network encodes each architecture into an embedding, and a contrastive learning strategy encourages architectures with similar accuracy to lie close together in the embedding space. On top of these embeddings, a comparator estimates the relative performance of two architectures, enabling more precise pairwise assessments during search. To further increase robustness, GEC-IMC incorporates an isomorphic multi-comparison mechanism that evaluates multiple structurally equivalent variants of each architecture and aggregates their pairwise outcomes into a global ranking score, which provides consistent feedback for evolutionary selection. Experiments on standard NAS benchmarks demonstrate that GEC-IMC achieves state-of-the-art performance with improved robustness over existing predictors. Ablation studies confirm the complementary roles of embedding learning and multi-comparison in enhancing search efficiency.