Abstract
Neural architecture search (NAS) has attracted considerable attention in the research community and has recently seen successful industrial application. Differentiable architecture search (DARTS) is an efficient NAS method. However, the networks searched by DARTS are often unstable because of the large gap in architecture depth between the search phase and the verification phase. In addition, owing to unfair exclusive competition among candidate operations, DARTS is prone to an aggregation of skip connections, which can cause performance collapse. In this article, we propose progressive partial channel connections based on channel attention for differentiable architecture search (PA-DARTS) to address these problems. In the early stage of the search, we use channel attention to select only a few key channels for convolution while retaining all candidate operations. As the search progresses, we gradually increase the number of selected channels and eliminate unpromising candidate operations, so that both the search phase and the verification phase are carried out on 20-cell networks. Owing to the attention-based partial channel connections, PA-DARTS eliminates the unfair competition between operations and is more stable. Experimental results show that PA-DARTS achieves 97.59% and 83.61% classification accuracy on CIFAR-10 and CIFAR-100, respectively, and 75.3% classification accuracy on ImageNet.
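The attention-guided partial channel connection described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and a fixed global-average activation stands in for the learned channel-attention weights that PA-DARTS would actually use.

```python
import numpy as np

def select_channels(x, k):
    """Rank the channels of a feature map x with shape (C, H, W) by a
    simple attention score (here, the global average activation) and
    return the indices of the top-k channels plus a boolean mask.
    PA-DARTS would learn these attention weights instead."""
    scores = x.mean(axis=(1, 2))           # one score per channel
    topk = np.argsort(scores)[::-1][:k]    # keep the k strongest channels
    mask = np.zeros(x.shape[0], dtype=bool)
    mask[topk] = True
    return topk, mask

def partial_connection(x, k, op):
    """Apply candidate operation `op` only to the k attended channels and
    pass the remaining channels through unchanged. Growing k over the
    course of the search mirrors the progressive schedule in the abstract."""
    _, mask = select_channels(x, k)
    out = x.copy()
    out[mask] = op(x[mask])
    return out
```

Because every candidate operation sees the same attended subset of channels, no operation gains an advantage merely from how channels are partitioned, which is the intuition behind removing the unfair exclusive competition.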