SwapNet: Garment Transfer in Single View Images

Amit Raj, Patsorn Sangkloy, Huiwen Chang, Jingwan Lu, Duygu Ceylan, James Hays; The European Conference on Computer Vision (ECCV), 2018, pp. 666-682

Abstract


We present SwapNet, a framework to transfer garments across images of people with arbitrary body pose, shape, and clothing. Garment transfer is a challenging task that requires (i) disentangling the features of the clothing from the body pose and shape and (ii) realistic synthesis of the garment texture on the new body. We present a neural network architecture that tackles these sub-problems with two task-specific sub-networks. Since acquiring pairs of images showing the same clothing on different bodies is difficult, we propose a novel weakly-supervised approach that generates training pairs from a single image via data augmentation. We present the first fully automatic method for garment transfer in unconstrained images without solving the difficult 3D reconstruction problem. We demonstrate a variety of transfer results and highlight our advantages over traditional image-to-image and analogy pipelines.
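The two-stage design described in the abstract can be sketched as follows. This is a minimal illustrative outline, not the authors' exact architecture: the module names, channel counts, and input encodings (a one-hot clothing segmentation and a pose map) are assumptions. A warping sub-network predicts the desired clothing layout on the target body, and a texturing sub-network renders the final image from that warped layout plus the source garment appearance.

```python
# Minimal sketch of a two-stage garment-transfer pipeline in PyTorch.
# All module names, channel counts, and input encodings (18-channel pose map,
# 4-channel clothing segmentation) are illustrative assumptions, not the
# exact SwapNet architecture.
import torch
import torch.nn as nn


class WarpModule(nn.Module):
    """Stage 1: predict the desired clothing segmentation on the target body."""

    def __init__(self, seg_channels=4, pose_channels=18):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(seg_channels + pose_channels, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, seg_channels, 3, padding=1),
        )

    def forward(self, source_seg, target_pose):
        # Concatenate the source clothing segmentation with the target body
        # representation and predict per-pixel clothing-class logits.
        return self.net(torch.cat([source_seg, target_pose], dim=1))


class TextureModule(nn.Module):
    """Stage 2: render RGB output from the warped segmentation and source appearance."""

    def __init__(self, seg_channels=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(seg_channels + 3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, warped_seg, source_rgb):
        return self.net(torch.cat([warped_seg, source_rgb], dim=1))


if __name__ == "__main__":
    warp, texture = WarpModule(), TextureModule()
    src_seg = torch.randn(1, 4, 128, 128)    # clothing segmentation of person A
    tgt_pose = torch.randn(1, 18, 128, 128)  # body/pose encoding of person B
    src_rgb = torch.randn(1, 3, 128, 128)    # RGB image of person A (garment appearance)

    warped = warp(src_seg, tgt_pose).softmax(dim=1)
    result = texture(warped, src_rgb)        # garment of A rendered on body of B
    print(result.shape)                      # torch.Size([1, 3, 128, 128])
```

In this sketch the weakly-supervised aspect mentioned in the abstract would correspond to building training pairs by augmenting a single image (e.g., flips, crops, and warps of the same person) so that the source and target differ in pose while sharing the same garment.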

Related Material


[bibtex]
@InProceedings{Raj_2018_ECCV,
  author    = {Raj, Amit and Sangkloy, Patsorn and Chang, Huiwen and Lu, Jingwan and Ceylan, Duygu and Hays, James},
  title     = {SwapNet: Garment Transfer in Single View Images},
  booktitle = {The European Conference on Computer Vision (ECCV)},
  month     = {September},
  year      = {2018}
}