Abstract
We present the first one-shot personalized sketch segmentation method. Given a
single exemplar sketch with its part annotation, we aim to segment all sketches
of the same category while (i) preserving the part semantics embedded in the
exemplar, and (ii) remaining robust to input style and abstraction. We refer to
this scenario as personalized. In doing so, we enable a much-desired
personalization capability for downstream fine-grained sketch analysis tasks.
To train a robust segmentation module, we deform the exemplar sketch to match
each of the available sketches of the same category. Our method generalizes to
sketches not observed during training. Our
central contribution is a sketch-specific hierarchical deformation network.
Given a multi-level encoding of sketch strokes obtained via a graph
convolutional network, our method estimates, at the upper level, a rigid-body
transformation from the target to the exemplar. At the lower level, finer
deformation from the exemplar to the globally warped target sketch is obtained
through stroke-wise deformations. Both levels of deformation are guided by mean
squared distances between keypoints learned without supervision, ensuring
that the stroke semantics are preserved. We evaluate our method against the
state-of-the-art segmentation and perceptual grouping baselines re-purposed for
the one-shot setting and against two few-shot 3D shape segmentation methods. We
show that our method outperforms all the alternatives by more than $10\%$ on
average. Ablation studies further demonstrate that our method is robust to
personalization, i.e., to changes in input part semantics and to style differences.
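
The following is a minimal illustrative sketch (assuming PyTorch and strokes given as 2D point sequences) of a two-level deformation guided by a keypoint mean-squared-error loss. It is a toy stand-in under stated assumptions, not the described implementation: the graph convolutional encoder and unsupervised keypoints are replaced by stroke centroids, and the lower-level deformation is reduced to learnable per-stroke offsets.

import torch

def rigid_align(src_kp, tgt_kp):
    # Closed-form 2D rigid (rotation + translation) alignment of source
    # keypoints onto target keypoints via the Kabsch/Procrustes solution.
    src_c, tgt_c = src_kp.mean(0), tgt_kp.mean(0)
    H = (src_kp - src_c).T @ (tgt_kp - tgt_c)      # 2x2 cross-covariance
    U, _, Vt = torch.linalg.svd(H)
    D = torch.eye(2)
    D[1, 1] = torch.sign(torch.det(Vt.T @ U.T))    # avoid reflections
    R = Vt.T @ D @ U.T                             # rotation
    t = tgt_c - R @ src_c                          # translation
    return R, t

def deform(exemplar_strokes, target_kp, steps=200, lr=1e-2):
    # Upper level: rigidly align exemplar keypoints (here: stroke centroids,
    # a crude proxy for learned keypoints) to the target keypoints.
    ex_kp = torch.stack([s.mean(0) for s in exemplar_strokes])
    R, t = rigid_align(ex_kp, target_kp)
    warped = [s @ R.T + t for s in exemplar_strokes]

    # Lower level: per-stroke offsets refined by minimizing keypoint MSE.
    offsets = [torch.zeros(1, 2, requires_grad=True) for _ in warped]
    opt = torch.optim.Adam(offsets, lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        kp = torch.stack([(s + o).mean(0) for s, o in zip(warped, offsets)])
        loss = torch.mean((kp - target_kp) ** 2)   # keypoint MSE guidance
        loss.backward()
        opt.step()
    return [(s + o).detach() for s, o in zip(warped, offsets)]

# Toy usage: three exemplar strokes, one target keypoint per stroke.
strokes = [torch.randn(10, 2) for _ in range(3)]
target_kp = torch.randn(3, 2)
deformed = deform(strokes, target_kp)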