CooGAN: A Memory-Efficient Framework for High-Resolution Facial Attribute Editing

Xuanhong Chen, Bingbing Ni, Naiyuan Liu, Ziang Liu, Yiliu Jiang, Loc Truong, Qi Tian


In contrast to the great success of memory-consuming face editing methods at low resolution, manipulating high-resolution (HR) facial images, i.e., typically larger than $768^2$ pixels, with very limited memory remains challenging. This is due to two reasons: 1) the intractably large memory demand; 2) inefficient multi-scale feature fusion. To address these issues, we propose a novel pixel translation framework called *Cooperative GAN (CooGAN)* for HR facial image editing. The framework features a local path for fine-grained local facial patch generation (i.e., patch-level HR, low memory) and a global path for monitoring the global low-resolution (LR) facial structure (i.e., image-level LR, low memory), which together largely reduce memory requirements. Both paths work in a cooperative manner under a local-to-global consistency objective (i.e., for smooth stitching). In addition, we propose a lighter selective transfer unit for more efficient multi-scale feature fusion, yielding higher-fidelity facial attribute manipulation. Extensive experiments on CelebA-HQ demonstrate both the memory efficiency and the high image generation quality of the proposed framework.

Keywords: Generative Adversarial Networks, Conditional GANs, Face Attribute Editing
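To make the memory argument concrete, the two-path idea can be sketched as follows. This is a minimal illustrative stand-in, not the paper's implementation: `edit_hr_image`, `edit_fn`, and the naive strided downsampling are all hypothetical placeholders for the actual LR generator and patch-level generator, but the sketch shows why full-resolution activations never need to reside in memory all at once.

```python
import numpy as np

def edit_hr_image(image, edit_fn, patch=256, lr_size=64):
    """Hypothetical sketch of a two-path scheme: a global path sees only
    a low-resolution copy of the image, while a local path edits the HR
    image one patch at a time, conditioned on that global LR map."""
    h, w, c = image.shape
    # Global path: coarse LR structure (stand-in for the LR generator).
    lr = image[::h // lr_size, ::w // lr_size]  # naive downsampling
    out = np.empty_like(image)
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            tile = image[y:y + patch, x:x + patch]
            # Local path: edit each HR patch given the global LR context;
            # only one patch's worth of activations is live at a time.
            out[y:y + patch, x:x + patch] = edit_fn(tile, lr)
    return out

# Toy "edit": brighten each patch (the LR context is unused for simplicity).
img = np.random.rand(768, 768, 3).astype(np.float32)
result = edit_hr_image(img, lambda tile, lr_ctx: np.clip(tile * 1.1, 0, 1))
print(result.shape)  # (768, 768, 3)
```

In a real model the per-patch `edit_fn` would be a generator network and a consistency loss would penalize seams between adjacent patches, matching the local-to-global consistency objective described above.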
