Abstract
Most recent few-shot learning (FSL) methods are based on meta-learning with
episodic training. In each meta-training episode, a discriminative feature
embedding and/or classifier are first constructed from a support set in an
inner loop, and then evaluated in an outer loop using a query set for model
updating. However, this query-set-sample-centered learning objective is
intrinsically limited in addressing the scarcity of training data in the
support set. In this paper, a novel contrastive prototype learning with
augmented embeddings (CPLAE) model is proposed to overcome this limitation.
First, data augmentations are applied to both the support and query sets,
with each sample now represented as an augmented embedding (AE) formed by
concatenating the embeddings of its original and augmented versions. Second,
a novel support-set class-prototype-centered contrastive loss is proposed for
contrastive prototype learning (CPL). With a class prototype as the anchor, CPL
aims to pull query samples of the same class closer and push those of different
classes further away. This support-set-sample-centered loss is highly
complementary to the existing query-centered loss, fully exploiting the limited
training data in each episode. Extensive experiments on several benchmarks
demonstrate that our proposed CPLAE achieves new state-of-the-art results.
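
The following is a minimal sketch, not the paper's implementation, of the two components described above: augmented embeddings built by concatenating original and augmented features, and a prototype-anchored contrastive loss over query samples. The feature extractor `f`, the function names (`augmented_embedding`, `cpl_loss`), the temperature value, and the use of cosine similarity are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def augmented_embedding(f, x, x_aug):
    """Augmented embedding (AE): concatenate the embeddings of the
    original and augmented versions of each sample (illustrative)."""
    return torch.cat([f(x), f(x_aug)], dim=-1)            # [N, 2D]

def cpl_loss(support_ae, support_y, query_ae, query_y, tau=0.1):
    """Prototype-centered contrastive loss sketch: each class prototype
    (mean of support AEs) serves as an anchor; query AEs of the same
    class are pulled closer, those of other classes pushed away."""
    classes = support_y.unique()
    # Class prototypes computed from the support set AEs.
    protos = torch.stack(
        [support_ae[support_y == c].mean(0) for c in classes])  # [C, 2D]
    protos = F.normalize(protos, dim=-1)
    q = F.normalize(query_ae, dim=-1)                      # [M, 2D]
    sim = protos @ q.t() / tau                             # [C, M] prototype-to-query similarities
    # Softmax over all query samples for each prototype anchor.
    log_p = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    loss = 0.0
    for i, c in enumerate(classes):
        pos = (query_y == c)                               # positives: queries of the anchor's class
        loss = loss - log_p[i, pos].mean()
    return loss / len(classes)
```

In this formulation the anchors are support-derived prototypes rather than individual query samples, which is what makes the loss complementary to the usual query-centered objective; in practice it would be combined with the standard prototype-classification loss on the query set.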