Abstract
Neural Style Transfer (NST) is the field of study that applies neural techniques to
modify the artistic appearance of a content image to match the style of a reference
style image. Traditionally, NST methods have focused on texture-based image
edits, affecting mostly low-level information while leaving most image structures
unchanged. However, style-based deformation of the content is desirable for some
styles, especially when the style is abstract or its primary concept lies in how it
deforms its subject. With the recent introduction of
diffusion models, such as Stable Diffusion, we can access far more powerful image
generation techniques, enabling new possibilities. In our work, we propose using
this new class of models to perform style transfer while also enabling deformable
style transfer, a capability that has remained elusive for previous models. We show
how leveraging the priors of these models can expose new artistic controls at inference
time, and we document our findings from exploring this new direction for the field of
style transfer.