


ReshapeGAN: Object Reshaping by Providing A Single Reference Image

Ziqiang Zheng∗1, Yang Wu†2, Zhibin Yu1, Yang Yang3, Haiyong Zheng1, and Takeo Kanade4

1Ocean University of China. 2Nara Institute of Science and Technology, Japan.

3University of Electronic Science and Technology of China. 4Carnegie Mellon University, United States.

Abstract

The aim of this work is to learn to reshape the object in an input image into an arbitrary new shape, by simply providing a single reference image containing an object instance in the desired shape. We propose a new Generative Adversarial Network (GAN) architecture for this object reshaping problem, named ReshapeGAN. The network can be tailored to all kinds of problem settings, including both within-domain (or single-dataset) reshaping and cross-domain (typically across multiple datasets) reshaping, with paired or unpaired training data. The appearance of the input object is preserved in all cases, so it remains identifiable after reshaping, which as far as we are aware has never been achieved before. We present the tailored models of the proposed ReshapeGAN for all the problem settings, and test them on 8 kinds of reshaping tasks with 13 different datasets, demonstrating the ability of ReshapeGAN to generate convincing and superior results for object reshaping. To the best of our knowledge, we are the first to make one GAN framework work on all such object reshaping tasks, especially the cross-domain tasks handling multiple diverse datasets. We present both ablation studies on our proposed ReshapeGAN models and comparisons with state-of-the-art models when they are made comparable, using all kinds of applicable metrics

∗†Equal contribution.

that we are aware of. Code will be available at https://github.com/zhengziqiang/ReshapeGAN.

(Figure 1 panels, left to right: Input, Ref., Output.)

Figure 1: Object reshaping with ReshapeGAN, guided by another object from a single reference image (Ref.). The appearance is preserved (thus remaining identifiable) while the reshaping is controllable by simply choosing a reference image with the desired shape. The reference can be from an arbitrary domain (same as or different from that of the input).

1 Introduction

Many of us may have admired some other people's faces or bodies, though it may be hard for us even to dream about changing our own faces or bodies into those desired ones, let alone taking action for actual reshaping. But what if the reshaping could be seen instantly, as shown in Fig. 1, by just


arXiv:1905.06514v1 [cs.CV] 16 May 2019


providing a picture of ourselves and another of some other person who has the desired shape? In greater detail, if the virtual reshaping can preserve our identity and appearance, while at the same time transferring the same nice expression or pose as that of the admired person, wouldn't you want to try and see this dream made visible?

It is actually even beyond what dreaming can do, because as humans we can hardly imagine the details of the changes caused by such a reshaping if we have never seen similar results before. A skilled artist may be able to perform such translation and imagination, but it is definitely not an easy task. Nevertheless, we have the ability to feel how good such reshaping results look, even without a ground truth. We can be somewhat sensitive to how well the identity and appearance are preserved, how well the changed shape matches that of the reference image, and how realistic the generated image is. Clearly, it is fun to try such a task.

In this paper, we propose a framework called ReshapeGAN that can automatically learn to do such reshaping, of which some exemplar results are shown in Fig. 1. ReshapeGAN not only works on faces and bodies, but also applies to other objects like cats.

ReshapeGAN is a type of Generative Adversarial Network (GAN), which was first introduced by Goodfellow et al. in 2014 [19]. GANs have been developed and proved to be very effective on image generation tasks for many applications [44, 79, 58, 74, 80, 25, 71, 52, 10, 23, 35, 39, 29]. Generally, conditional image generation falls into two categories: supervised image generation and unsupervised image-to-image translation. The former can usually generate higher-quality results [44, 68, 1], which are more realistic [24] or of higher resolution [64], but it has the limitation of relying on paired training data (input and ground-truth pairs), which may be hard or impossible to collect in many applications. The latter does not require such paired training data and is thus more widely applicable. Though unsupervised image-to-image translation is more challenging, its merits have attracted researchers to make much progress, including CycleGAN [80], DualGAN [71], DiscoGAN [25], MUNIT [23], DRIT [35], StarGAN [10], etc. [22]. In particular, cross-domain or cross-dataset unsupervised image-to-image translation is the most challenging, as each domain or dataset has its own style and attributes. Image generation has to manipulate and control such styles and attributes

for a desired output [44, 79, 80, 25, 71, 52, 10, 23, 35]. The proposed ReshapeGAN learns to preserve the appearance of the input object instance and obtain the shape information from the reference image for the reshaping tasks. Its framework can be tailored for both supervised image generation and unsupervised image-to-image translation, within a domain or across domains/datasets, as shown in Fig. 2. As far as we are aware, this is the first time that object reshaping is made possible for all these settings, especially for the cross-domain/cross-dataset setting.

Usually, cross-domain image-to-image translation requires a large amount of training data from each individual to ensure reasonably good performance, due to the possibly large style and attribute differences between domains/datasets. Nevertheless, ReshapeGAN is made effective with a relatively small amount of data from each individual, so that it is as generally applicable as possible.

To demonstrate the effectiveness and superiority of ReshapeGAN, we conduct extensive experiments for each of the three main settings, resulting in a total of 8 different tasks on 13 datasets. Since there is no unique metric that is widely recognized as the standard for assessing the quality of generated images, we adopt all the applicable metrics that we are aware of and use them for performance evaluation and comparison. We compare ReshapeGAN with state-of-the-art methods on all the tasks to which they can be applied or made applicable, and perform ablation studies to verify the effectiveness of each component of ReshapeGAN.

In brief, the main contributions of this work can be summarized as follows.

• It presents, as far as we know, the first general reference-guided object reshaping framework, which preserves the object's identity and works on even the most diverse input-reference pairs.

• It introduces tailored models of the proposed framework, which are applicable to both supervised and unsupervised reshaping, for both within-domain and cross-domain or cross-dataset scenarios.

• Extensive experiments have been done on all the settings with many tasks and datasets, to show the effectiveness of the proposal with ablation studies and



(Figure 2 diagram: in each panel, shape extraction maps images from the image manifold into a shape space, from which reshaping produces the reshaped images; panel (c) spans Domain A and Domain B.)

Figure 2: The typical settings that ReshapeGAN handles: (a) reshaping by within-domain shape guidance with paired data, (b) reshaping by within-domain shape guidance with unpaired data, and (c) reshaping by cross-domain shape guidance.

its significant superiority in comparison with state-of-the-art methods when they are, or are made to be, comparable.

2 Related work

2.1 Generative Adversarial Networks (GANs)

Thanks to the development of GANs [19], recent extensions of GANs have achieved great progress in various vision tasks such as image editing and in-painting [17, 51, 69, 3], super resolution [34, 9, 59], image restoration and enhancement [40, 72], object detection [36, 47], and other applications [12, 79, 74, 43, 55, 46]. Regarding adversarial training, how to avoid mode

collapse and reduce training instability is an open question [39, 11]. To achieve this goal, some studies focus on designing novel network architectures and training mechanisms [8, 48], while more studies have tried to solve the problem by updating the adversarial loss function, such as Boundary-Seeking Generative Adversarial Networks [13], Wasserstein Generative Adversarial Networks (WGAN) [2], Loss-Sensitive Generative Adversarial Networks (LS-GAN) [53], Least Squares Generative Adversarial Networks (LSGAN) [43], etc. [29]. In our tasks, we also include some efficient generative adversarial losses to improve the training process and yield more plausible results. We take advantage of WGAN [2] and its extension WGAN-GP [20] to make the adversarial training process more stable. Furthermore, we use the perturbed loss from DRAGAN [28] to improve the synthesis quality.
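To make the WGAN-GP stabilization concrete, here is an illustrative sketch (not the paper's implementation): the critic objective with a gradient penalty, for a toy linear critic D(x) = w·x whose input gradient is exactly w, so the penalty can be computed in closed form.

```python
import numpy as np

def wgan_gp_critic_loss(w, real, fake, lam=10.0):
    """WGAN-GP critic loss for a linear critic D(x) = w . x (toy example)."""
    d_real = real @ w                      # critic scores on real samples
    d_fake = fake @ w                      # critic scores on generated samples
    # WGAN-GP penalizes the critic's input-gradient norm at points interpolated
    # between real and fake samples; for a linear critic that gradient is w
    # everywhere, so the penalty is (||w|| - 1)^2 in closed form.
    grad_norm = np.linalg.norm(w)
    penalty = lam * (grad_norm - 1.0) ** 2
    # the critic minimizes E[D(fake)] - E[D(real)] + penalty
    return d_fake.mean() - d_real.mean() + penalty

w = np.array([1.0, 0.0])                   # unit-norm critic -> zero penalty
real = np.array([[2.0, 0.0]])
fake = np.array([[0.0, 0.0]])
loss = wgan_gp_critic_loss(w, real, fake)  # 0 - 2 + 0 = -2.0
```

With a real network, the same penalty is obtained by differentiating the critic output with respect to the interpolated inputs via autograd.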



2.2 GAN-based image-to-image translation

Benefiting from the success of conditional GANs (cGANs) [44], much GAN-related research has focused on image-to-image translation tasks. According to the data type, these tasks can be roughly divided into paired [24, 64, 81] and unpaired [80, 71, 60, 37, 25, 54] image-to-image translation. Paired image-to-image translation approaches usually require paired images for training. They typically need one or more pixel-wise constraints, such as the L1 norm, to implement supervised learning with data pairs [24]. However, paired data are expensive and sometimes impossible to collect. To overcome this limitation, CycleGAN, DualGAN and DiscoGAN applied a cycle consistency to translate images using unpaired images [80, 25, 71]. Besides, Choi et al. proposed StarGAN [10], which used a single model and latent code to handle one-to-many image-to-image translation tasks. Similar to ACGAN [48], which trained the generator with a discriminator combined with an auxiliary classifier, StarGAN inherited the structure and the cycle consistency loss from CycleGAN and used an additional classifier to control the attributes of the synthesized images. Besides, combining variational autoencoders (VAE) [27] with GANs is another important branch for translating images between different domains [32] with unpaired data. Fader Networks [30] used attribute-invariant representations, encoded from the input image, together with the latent code for image reconstruction. Zhu et al. proposed BicycleGAN [81], which combines VAE-GAN [32] objectives and latent regressor objectives [14, 16] for a bijective consistency to obtain more realistic and diverse samples using paired data. These studies have advanced the development of one-to-one image-to-image translation. However, no existing work can achieve identity-preserved object reshaping both for the case of training with paired data and for that with only unpaired data across domains/datasets.
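The cycle-consistency idea underlying CycleGAN, DualGAN and DiscoGAN can be stated compactly: translating A → B and back B → A should reconstruct the original image, typically measured by an L1 distance. A minimal sketch, with toy stand-in "generators" (the names and brightness-shift example are purely illustrative):

```python
import numpy as np

def cycle_consistency_loss(x, G_ab, G_ba):
    """L1 reconstruction error after a round trip A -> B -> A."""
    return np.abs(G_ba(G_ab(x)) - x).mean()

# Toy "generators": a constant brightness shift and its exact inverse.
G_ab = lambda x: x + 0.5
G_ba = lambda x: x - 0.5
x = np.zeros((4, 4))
loss = cycle_consistency_loss(x, G_ab, G_ba)  # exact inverse -> 0.0
```

In the unpaired setting, this round-trip term substitutes for the pixel-wise supervision that paired methods get from ground-truth pairs.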

2.3 Conditional image editing

Conditional generative models are widely used for image synthesis under one or more given conditions. Many studies were developed based on two pioneering works: the conditional variational autoencoder (CVAE) [68] and the conditional Generative Adversarial Network (cGAN) [44]. Lassner et al. proposed a conditional architecture hybridizing VAE with adversarial training to generate full-body people in clothing [33]. Reed et al. [55] presented a generative model to generate bird and flower images conditioned on text descriptions by adding textual information to both the generator and the discriminator. They further discussed the feasibility of controlling the features, structure and locations of the generated images with different conditional text-to-image models [57, 56]. StackGAN [74] and its extension StackGAN++ [73] can also use text descriptions to generate high-quality photo-realistic images. As mentioned before, StarGAN [10] used one-hot encoded labels as conditions to translate images from one domain to another with unpaired data.

Compared with these works, which use labels and texts as conditions for image generation, using multiple images as conditions for image synthesis is more challenging. Ma et al. [42] considered both pose images and person images as conditions to guide the network to generate a person image with a specified pose. Yang et al. [70] further extended this idea to generate videos under the constraint of a pose series. Zhao et al. [78] explored generating multi-view cloth images from only a single-view input, while Ma et al. [41] and Park et al. [50] used an extra image as an exemplar for semantics-preserving unsupervised translation, which has a similar motivation to our task. However, most image-guided image editing approaches rely on paired images to implement pixel-wise supervised training, or are limited to some low-level (color, texture, etc.) translation applications. Unlike those methods, our proposed ReshapeGAN can work on unpaired training data for high-level object reshaping tasks. We achieve this by making use of shape and appearance information in a more efficient and flexible way. Our model can generate a desired image that preserves the appearance of a specific object instance while borrowing the shape information from another image, even across domains.

3 Network Architecture for Object Reshaping

Given an input image x ∈ X and a reference image y ∈ Y whose geometric (shape) guidance is represented by sy, object reshaping aims to learn a generator G which can generate a new image G(x, sy) inheriting x's appearance



while at the same time changing to y's shape sy. Our proposal for object reshaping is called ReshapeGAN, which can be tailored for three typical settings (Fig. 2). We introduce each of the settings and our corresponding tailored model in the following subsections.
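As a minimal sketch of how the generator can consume both inputs, following the concatenation shown in the paired model's diagram: the input image x and the shape representation sy extracted from the reference are stacked along the channel axis to form G's input. The image sizes and the single-channel shape map here are our illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((3, 64, 64))      # input image in CHW layout (assumed size)
s_y = rng.random((1, 64, 64))    # shape guidance, e.g. an edge/landmark map
# channel-wise concatenation: the generator sees appearance and shape jointly
g_input = np.concatenate([x, s_y], axis=0)  # shape (4, 64, 64)
```

The same conditioning pattern appears in many image-guided generators: the network never needs a separate pathway for the shape signal, since convolutions mix the concatenated channels from the first layer on.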

3.1 Reshaping by within-domain guidance with paired data

When the reference image is in the same domain (which usually means the same dataset or style) as the input image, and the reference image depicts the same object instance (i.e., has the same identity) as the one in the input image, we get the easiest setting for object reshaping, which has paired training data. In this case, ReshapeGAN can be learned by solving the following problem:

G∗ = arg min_G max_D L_ReshapeGAN^paired(G, D),   (1)

with

L_ReshapeGAN^paired(G, D) = L_adv^s(G, D) + γ L_perturb(D) + δ L_pixel^paired(G) + σ L_percep^paired(G),   (2)

where D is the discriminator; L_adv^s(G, D) is the adversarial loss with shape guidance; L_perturb(D) is the perturbed loss for regularizing D; L_pixel^paired(G) is the paired pixel-level appearance matching loss; L_percep^paired(G) denotes the perceptual loss; and γ, δ, and σ are hyper-parameters balancing the corresponding losses. Details of the loss functions are explained below, and the overall model is illustrated in Fig. 3.
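Assembling the full paired objective of Eq. (2) is then a weighted sum of the four terms. A hedged sketch; the default weight values below are placeholders for illustration, not the paper's settings:

```python
def paired_reshapegan_loss(l_adv, l_perturb, l_pixel, l_percep,
                           gamma=1.0, delta=10.0, sigma=1.0):
    """Weighted sum of the four loss terms in Eq. (2).

    gamma, delta, sigma balance the perturbed, pixel-level and perceptual
    terms against the shape-guided adversarial loss (illustrative defaults).
    """
    return l_adv + gamma * l_perturb + delta * l_pixel + sigma * l_percep

# Example with dummy per-term values:
total = paired_reshapegan_loss(1.0, 2.0, 0.5, 0.25)  # 1 + 2 + 5 + 0.25 = 8.25
```

In practice each argument would be the scalar output of the corresponding loss module evaluated on a training batch, and the generator/discriminator are updated against this combined objective per Eq. (1).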

(Figure 3 diagram: the shape sy is extracted from the reference image into the shape space and concatenated with the input x to feed the generator G; the output G(x, sy) is compared against the ground truth y via the paired pixel loss L_pixel^paired(G) and, through VGG-19 features, the perceptual loss L_percep^paired(G), while the discriminator D, fed concatenations of image and shape, provides the shape-guided adversarial loss L_adv^s and is regularized by the perturbed loss L_perturb(D).)

Figure 3: The detailed ReshapeGAN model for reshaping by within-domain guidance with paired data.

Shape guided adversarial loss \mathcal{L}^{s}_{adv}(G,D). Based on our shape-guided generator G and following the common definition of the adversarial loss, we have \mathcal{L}^{s}_{adv}(G,D) defined as

\mathcal{L}^{s}_{adv}(G,D) = \mathbb{E}_{y}[\log D(y, s_y)] + \mathbb{E}_{x,y}[\log(1 - D(G(x, s_y), s_y))]. \tag{3}
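As a concrete illustration, the shape-conditioned objective of Eq. (3) can be written with the standard binary cross-entropy formulation. The sketch below is a minimal PyTorch version; the framework, the function names, and the convention that D takes the image and the shape map s_y as separate arguments are assumptions for illustration, not details prescribed by the paper.

```python
import torch
import torch.nn.functional as F

def shape_adv_loss_D(D, y, s_y, fake):
    """Discriminator side of Eq. (3): the real pair (y, s_y) should be
    classified as real, the generated pair (G(x, s_y), s_y) as fake."""
    real_logits = D(y, s_y)
    fake_logits = D(fake.detach(), s_y)  # do not backprop into G here
    return (F.binary_cross_entropy_with_logits(real_logits, torch.ones_like(real_logits))
            + F.binary_cross_entropy_with_logits(fake_logits, torch.zeros_like(fake_logits)))

def shape_adv_loss_G(D, fake, s_y):
    """Generator side: make D(G(x, s_y), s_y) look real (the common
    non-saturating variant of the minimax objective in Eq. (3))."""
    fake_logits = D(fake, s_y)
    return F.binary_cross_entropy_with_logits(fake_logits, torch.ones_like(fake_logits))
```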

The adversarial loss is a basic requirement in an adversarial network: it ensures that the generated image G(x, s_y) cannot be distinguished from the reference image y by the discriminator D in terms of the shape information, but not the appearance as in existing works.

Perturbed loss \mathcal{L}_{perturb}(D). To further improve the robustness of the discriminator, we introduce to our model the perturbed loss originating from the DRAGAN work [28]. The perturbed loss can be expressed as

\mathcal{L}_{perturb}(D) = \mathbb{E}_{x,y}\big[(\|\nabla_{\hat{x}} D(\hat{x}, s_y)\|_2 - 1)^2\big], \quad \text{with } \hat{x} = (1-\alpha)x + \alpha z, \text{ where } \alpha \sim U[0,1]. \tag{4}
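The gradient penalty of Eq. (4) can be sketched in PyTorch as follows; the framework and function name are assumptions, and the perturbed sample x_hat is drawn per sample exactly as in the equation.

```python
import torch

def perturbed_loss(D, x, s_y):
    """Sketch of Eq. (4): penalize the gradient norm of D at a point
    x_hat interpolated between the real image x and Gaussian noise z."""
    z = torch.randn_like(x)                                   # z ~ N(0, I)
    alpha = torch.rand(x.size(0), 1, 1, 1, device=x.device)   # alpha ~ U[0, 1]
    x_hat = ((1.0 - alpha) * x + alpha * z).requires_grad_(True)
    d_out = D(x_hat, s_y)
    grads = torch.autograd.grad(d_out.sum(), x_hat, create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(2, dim=1)               # per-sample L2 norm
    return ((grad_norm - 1.0) ** 2).mean()
```

`create_graph=True` keeps the penalty differentiable so it can be minimized by the discriminator's optimizer.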

Here x represents a real image sample and z denotes random noise drawn from a Gaussian distribution; \alpha is a random hyper-parameter that controls the balance between the real sample and the noise, following the continuous uniform distribution on [0, 1]; \nabla_{\hat{x}} is the gradient of D(\hat{x}, s_y) with respect to \hat{x}. Please note that \hat{x} is only used for calculating \mathcal{L}_{perturb}(D). We believe that such a perturbed loss makes convergence more stable and yields generated images of higher quality; the experimental results in Section 4 also support this point.

Paired pixel-level appearance matching loss \mathcal{L}^{paired}_{pixel}(G). This is a widely used loss for ensuring that the generated image matches the ground-truth image, or has the desired appearance, at the pixel level, and usually the L1 norm is used. When paired training data are available, the loss is simply

\mathcal{L}^{paired}_{pixel}(G) = \mathbb{E}_{x,y}\big[\|G(x, s_y) - y\|_1\big], \tag{5}

which measures the generation error with respect to the provided ground-truth data y.

Paired perceptual loss \mathcal{L}^{paired}_{percep}(G). Unlike the pixel-level loss, which matches two images pixel by pixel, the perceptual loss measures the similarity at the feature level. In greater detail, we follow Chen et al.'s work [7] and include an extra VGG-19 network, pre-trained on ImageNet, to compute the perceptual loss [34, 15, 7]. Compared to the pixel-level loss, this loss combines the distance at multiple scales of feature representation, which can carry


low-level and high-level information about the images. The perceptual loss is described as

\mathcal{L}^{paired}_{percep}(G) = \sum_{n=1}^{N} \lambda_n \mathbb{E}_{x,y}\big[\|\Phi_n(y) - \Phi_n(G(x, s_y))\|_1\big], \tag{6}

where \Phi_n is the feature extractor at the n-th level of the pre-trained VGG-19 network. Following [7], we compute the perceptual loss between the outputs at N = 5 selected layers. The weights of the pre-trained model are not optimized during the learning process. The hyper-parameter \lambda_n controls the influence of the perceptual loss at each scale: the loss from higher layers controls the global structure, while the loss from lower layers controls the local details during generation. The generator thus has to produce better synthesized images to fool this hybrid discriminator, which finally improves the synthesized image quality. Please note that only for paired training do we compute the perceptual loss between the target images and the generated images. The details of the proposed method for reshaping by within-domain guidance with paired data are shown in Fig. 3.
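The multi-scale sum of Eq. (6) can be sketched as below. Here the feature extractors \Phi_n are passed in as callables (in the paper, slices of a frozen ImageNet-pretrained VGG-19); the callable interface and the mean reduction of the L1 distance are assumptions for this sketch.

```python
import torch

def perceptual_loss(phis, lambdas, target, generated):
    """Sketch of Eq. (6): weighted sum of L1 feature distances over N levels.

    phis    : list of N callables Phi_n (e.g. frozen VGG-19 slices)
    lambdas : per-level weights lambda_n
    """
    loss = target.new_zeros(())
    for phi, lam in zip(phis, lambdas):
        loss = loss + lam * (phi(target) - phi(generated)).abs().mean()
    return loss
```

With torchvision, the \Phi_n could be obtained by cutting `torchvision.models.vgg19(...).features` at the five chosen layers and freezing their parameters.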

3.2 Reshaping by within-domain guidance with unpaired data

For the case where the reference image is in the same domain as the input image but is not of the same instance/identity, the model has to be learned from unpaired training data. In this case, we use a model similar to the one introduced in Section 3.1, but with the pixel-level appearance matching loss and the perceptual loss in their unpaired versions. In detail, the overall loss function becomes

\mathcal{L}^{unpaired}_{ReshapeGAN}(G,D) = \mathcal{L}^{s}_{adv}(G,D) + \gamma \mathcal{L}_{perturb}(D) + \delta \mathcal{L}^{unpaired}_{pixel}(G) + \sigma \mathcal{L}^{unpaired}_{percep}(G), \tag{7}

where the unpaired versions of the pixel-level appearance matching loss \mathcal{L}^{unpaired}_{pixel}(G) and the perceptual loss \mathcal{L}^{unpaired}_{percep}(G) are defined below; this ReshapeGAN model is illustrated in Fig. 4.

Unpaired pixel-level appearance matching loss \mathcal{L}^{unpaired}_{pixel}(G). When there is no paired data for training, inspired by CycleGAN [80], we use the cycle consistency


Figure 4: The detailed ReshapeGAN model for reshaping by within-domain guidance with unpaired data.

reconstruction loss with two rounds of reshaping, guided by s_x and s_y (the shapes of x and y, respectively), as the unpaired pixel-level appearance matching loss:

\mathcal{L}^{unpaired}_{pixel}(G) = \mathbb{E}_{x,y}\big[\|G(G(x, s_y), s_x) - x\|_1\big]. \tag{8}

Unpaired perceptual loss $\mathcal{L}^{\text{unpaired}}_{\text{percep}}(G)$. In the case that there is no paired sample for the given input $x$, it is reasonable to assume that the generated image $G(x, s_y)$ has the same perceptual response as $x$, due to the desired appearance- and identity-preserving property, even though they have different shapes ($s_y$ vs. $s_x$). Unlike the pixel-wise matching loss, which requires good alignment and thus minimal shape difference, the perceptual loss operates on feature-level perception results, which have some robustness to shape changes. Therefore, we define the unpaired perceptual loss as the perceptual distance between the generated image $G(x, s_y)$ and the input image $x$:

$$\mathcal{L}^{\text{unpaired}}_{\text{percep}}(G) = \sum_{n=1}^{N} \lambda_n \, \mathbb{E}_{x,y}\left[\left\|\Phi_n(x) - \Phi_n(G(x, s_y))\right\|_1\right], \tag{9}$$

where $\Phi_n$ is the feature extractor at the $n$th level of the pre-trained VGG-19 network, the same as the one in Equation 6.
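Equation 9 can be sketched as follows; the toy strided-slicing extractors below merely stand in for the VGG-19 feature maps $\Phi_n$, and the weights `lambdas` play the role of $\lambda_n$ (all names here are illustrative assumptions):

```python
import numpy as np

def unpaired_perceptual_loss(extractors, x, y_gen, lambdas):
    """Weighted sum of per-level L1 feature distances between x and G(x, s_y),
    as in Eq. 9. `extractors` is a list of per-level feature maps Phi_n."""
    total = 0.0
    for phi, lam in zip(extractors, lambdas):
        total += lam * np.mean(np.abs(phi(x) - phi(y_gen)))
    return total

# Toy extractors: strided downsampling stands in for deeper VGG-19 levels.
feats = [lambda a, s=s: a[..., ::s, ::s] for s in (1, 2, 4)]
x = np.random.rand(3, 32, 32)
# Identical inputs must yield zero perceptual distance at every level.
loss_same = unpaired_perceptual_loss(feats, x, x, lambdas=[1.0, 0.5, 0.25])
```

Because the comparison happens in feature space rather than pixel space, moderate shape differences between $x$ and $G(x, s_y)$ perturb this loss far less than they would the pixel-level loss of Equation 8.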

3.3 Reshaping by cross-domain guidance

The hardest setting for object reshaping is cross-domain guidance, i.e., the case in which the reference image and the input image come from two different domains (e.g., two different styles or datasets). For this setting, we found it hard for the unpaired ReshapeGAN model introduced in Section 3.2 to learn stable appearance-preserving object reshaping, due to the possibly large domain differences. Therefore, we propose a two-stage strategy for dividing


[Figure: the detailed two-stage ReshapeGAN model for reshaping by cross-domain guidance. Shape extraction maps multiple datasets into a shared shape space, and extracted shapes are concatenated with the inputs. Stage 1 and Stage 2 apply the generator with shape guidance (producing, e.g., $G(G(x, s_y), s_x)$ and $G(G(G(x, s_y), s_y), s_x)$), trained with $\mathcal{L}_{\text{perturb}}(D)$, $\mathcal{L}^{s}_{\text{adv}}(D)$, $\mathcal{L}^{\text{unpaired}}_{\text{percep}}(G)$, $\mathcal{L}^{\text{unpaired}}_{\text{pixel}}(G)$, and the domain classification loss $\mathcal{L}_{\text{cls}}(D_d)$ of the domain discriminator $D_d$.]

sy<latexit sha1_base64="U5GjLwVr/t5269faQYAKi5fovvQ=">AAACxnicjVHLSsNAFD2Nr1pfVZdugkVwVZIq6LLopsuK9gG1lGQ6rUPTJEwmSimCP+BWP038A/0L74wpqEV0QpIz595zZu69fhyIRDnOa85aWFxaXsmvFtbWNza3its7zSRKJeMNFgWRbPtewgMR8oYSKuDtWHJv7Ae85Y/Odbx1y2UiovBKTWLeHXvDUAwE8xRRl0lv0iuWnLJjlj0P3AyUkK16VHzBNfqIwJBiDI4QinAADwk9HbhwEBPXxZQ4SUiYOMc9CqRNKYtThkfsiL5D2nUyNqS99kyMmtEpAb2SlDYOSBNRniSsT7NNPDXOmv3Ne2o89d0m9PczrzGxCjfE/qWbZf5Xp2tRGODU1CCoptgwujqWuaSmK/rm9peqFDnExGncp7gkzIxy1mfbaBJTu+6tZ+JvJlOzes+y3BTv+pY0YPfnOOdBs1J2j8qVi+NS9SwbdR572MchzfMEVdRQR4O8h3jEE56tmhVaqXX3mWrlMs0uvi3r4QOub5Bo</latexit>

sx<latexit sha1_base64="znr6jpBDjQP2S81gGIfEvIHxXiY=">AAACxnicjVHLSsNAFD2Nr1pfVZdugkVwVRIVdFl002VF+4BaSjKd1qFpEjITtRTBH3Crnyb+gf6Fd8YU1CI6IcmZc+85M/dePw6EVI7zmrPm5hcWl/LLhZXVtfWN4uZWQ0ZpwnidRUGUtHxP8kCEvK6ECngrTrg38gPe9IdnOt684YkUUXipxjHvjLxBKPqCeYqoC9m96xZLTtkxy54FbgZKyFYtKr7gCj1EYEgxAkcIRTiAB0lPGy4cxMR1MCEuISRMnOMeBdKmlMUpwyN2SN8B7doZG9Jee0qjZnRKQG9CSht7pIkoLyGsT7NNPDXOmv3Ne2I89d3G9PczrxGxCtfE/qWbZv5Xp2tR6OPE1CCoptgwujqWuaSmK/rm9peqFDnExGnco3hCmBnltM+20UhTu+6tZ+JvJlOzes+y3BTv+pY0YPfnOGdB46DsHpYPzo9KldNs1HnsYBf7NM9jVFBFDXXyHuART3i2qlZopdbtZ6qVyzTb+Lashw+sD5Bn</latexit>

sx<latexit sha1_base64="znr6jpBDjQP2S81gGIfEvIHxXiY=">AAACxnicjVHLSsNAFD2Nr1pfVZdugkVwVRIVdFl002VF+4BaSjKd1qFpEjITtRTBH3Crnyb+gf6Fd8YU1CI6IcmZc+85M/dePw6EVI7zmrPm5hcWl/LLhZXVtfWN4uZWQ0ZpwnidRUGUtHxP8kCEvK6ECngrTrg38gPe9IdnOt684YkUUXipxjHvjLxBKPqCeYqoC9m96xZLTtkxy54FbgZKyFYtKr7gCj1EYEgxAkcIRTiAB0lPGy4cxMR1MCEuISRMnOMeBdKmlMUpwyN2SN8B7doZG9Jee0qjZnRKQG9CSht7pIkoLyGsT7NNPDXOmv3Ne2I89d3G9PczrxGxCtfE/qWbZv5Xp2tR6OPE1CCoptgwujqWuaSmK/rm9peqFDnExGnco3hCmBnltM+20UhTu+6tZ+JvJlOzes+y3BTv+pY0YPfnOGdB46DsHpYPzo9KldNs1HnsYBf7NM9jVFBFDXXyHuART3i2qlZopdbtZ6qVyzTb+Lashw+sD5Bn</latexit>

sy<latexit sha1_base64="U5GjLwVr/t5269faQYAKi5fovvQ=">AAACxnicjVHLSsNAFD2Nr1pfVZdugkVwVZIq6LLopsuK9gG1lGQ6rUPTJEwmSimCP+BWP038A/0L74wpqEV0QpIz595zZu69fhyIRDnOa85aWFxaXsmvFtbWNza3its7zSRKJeMNFgWRbPtewgMR8oYSKuDtWHJv7Ae85Y/Odbx1y2UiovBKTWLeHXvDUAwE8xRRl0lv0iuWnLJjlj0P3AyUkK16VHzBNfqIwJBiDI4QinAADwk9HbhwEBPXxZQ4SUiYOMc9CqRNKYtThkfsiL5D2nUyNqS99kyMmtEpAb2SlDYOSBNRniSsT7NNPDXOmv3Ne2o89d0m9PczrzGxCjfE/qWbZf5Xp2tRGODU1CCoptgwujqWuaSmK/rm9peqFDnExGncp7gkzIxy1mfbaBJTu+6tZ+JvJlOzes+y3BTv+pY0YPfnOOdBs1J2j8qVi+NS9SwbdR572MchzfMEVdRQR4O8h3jEE56tmhVaqXX3mWrlMs0uvi3r4QOub5Bo</latexit>

G(x, sy), sy<latexit sha1_base64="b6kRUsiaUlU8cI2IKKkh+Ijelfg=">AAAB+XicbVDLSsNAFL3xWesr6tLNYBEqSElE0GXBhS4r2Ae0IUymk3boZBJmJsUQCn6IGxeKuPVP3Pk3TpsutPXA5R7OuZe5c4KEM6Ud59taWV1b39gsbZW3d3b39u2Dw5aKU0lok8Q8lp0AK8qZoE3NNKedRFIcBZy2g9HN1G+PqVQsFg86S6gX4YFgISNYG8m37dvq47ny82xyVjTfrjg1Zwa0TNw5qcAcDd/+6vVjkkZUaMKxUl3XSbSXY6kZ4XRS7qWKJpiM8IB2DRU4osrLZ5dP0KlR+iiMpSmh0Uz9vZHjSKksCsxkhPVQLXpT8T+vm+rw2suZSFJNBSkeClOOdIymMaA+k5RonhmCiWTmVkSGWGKiTVhlE4K7+OVl0rqouU7Nvb+s1FtPRRwlOIYTqIILV1CHO2hAEwiM4Rle4c3KrRfr3fooRleseYRH8AfW5w8ko5PN</latexit><latexit sha1_base64="b6kRUsiaUlU8cI2IKKkh+Ijelfg=">AAAB+XicbVDLSsNAFL3xWesr6tLNYBEqSElE0GXBhS4r2Ae0IUymk3boZBJmJsUQCn6IGxeKuPVP3Pk3TpsutPXA5R7OuZe5c4KEM6Ud59taWV1b39gsbZW3d3b39u2Dw5aKU0lok8Q8lp0AK8qZoE3NNKedRFIcBZy2g9HN1G+PqVQsFg86S6gX4YFgISNYG8m37dvq47ny82xyVjTfrjg1Zwa0TNw5qcAcDd/+6vVjkkZUaMKxUl3XSbSXY6kZ4XRS7qWKJpiM8IB2DRU4osrLZ5dP0KlR+iiMpSmh0Uz9vZHjSKksCsxkhPVQLXpT8T+vm+rw2suZSFJNBSkeClOOdIymMaA+k5RonhmCiWTmVkSGWGKiTVhlE4K7+OVl0rqouU7Nvb+s1FtPRRwlOIYTqIILV1CHO2hAEwiM4Rle4c3KrRfr3fooRleseYRH8AfW5w8ko5PN</latexit><latexit sha1_base64="b6kRUsiaUlU8cI2IKKkh+Ijelfg=">AAAB+XicbVDLSsNAFL3xWesr6tLNYBEqSElE0GXBhS4r2Ae0IUymk3boZBJmJsUQCn6IGxeKuPVP3Pk3TpsutPXA5R7OuZe5c4KEM6Ud59taWV1b39gsbZW3d3b39u2Dw5aKU0lok8Q8lp0AK8qZoE3NNKedRFIcBZy2g9HN1G+PqVQsFg86S6gX4YFgISNYG8m37dvq47ny82xyVjTfrjg1Zwa0TNw5qcAcDd/+6vVjkkZUaMKxUl3XSbSXY6kZ4XRS7qWKJpiM8IB2DRU4osrLZ5dP0KlR+iiMpSmh0Uz9vZHjSKksCsxkhPVQLXpT8T+vm+rw2suZSFJNBSkeClOOdIymMaA+k5RonhmCiWTmVkSGWGKiTVhlE4K7+OVl0rqouU7Nvb+s1FtPRRwlOIYTqIILV1CHO2hAEwiM4Rle4c3KrRfr3fooRleseYRH8AfW5w8ko5PN</latexit><latexit 
sha1_base64="b6kRUsiaUlU8cI2IKKkh+Ijelfg=">AAAB+XicbVDLSsNAFL3xWesr6tLNYBEqSElE0GXBhS4r2Ae0IUymk3boZBJmJsUQCn6IGxeKuPVP3Pk3TpsutPXA5R7OuZe5c4KEM6Ud59taWV1b39gsbZW3d3b39u2Dw5aKU0lok8Q8lp0AK8qZoE3NNKedRFIcBZy2g9HN1G+PqVQsFg86S6gX4YFgISNYG8m37dvq47ny82xyVjTfrjg1Zwa0TNw5qcAcDd/+6vVjkkZUaMKxUl3XSbSXY6kZ4XRS7qWKJpiM8IB2DRU4osrLZ5dP0KlR+iiMpSmh0Uz9vZHjSKksCsxkhPVQLXpT8T+vm+rw2suZSFJNBSkeClOOdIymMaA+k5RonhmCiWTmVkSGWGKiTVhlE4K7+OVl0rqouU7Nvb+s1FtPRRwlOIYTqIILV1CHO2hAEwiM4Rle4c3KrRfr3fooRleseYRH8AfW5w8ko5PN</latexit>

G<latexit sha1_base64="62MSal9IrrJAByTXj3PzAZZ4ydM=">AAACxHicjVHLSsNAFD2Nr1pfVZdugkVwVZIq6LIoqMsWbCvUIsl0WkMnDzIToRT9Abf6beIf6F94Z0xBLaITkpw5954zc+/1ExFI5TivBWtufmFxqbhcWlldW98ob261ZZyljLdYLOL0yvckF0HEWypQgl8lKfdCX/COPzrV8c4dT2UQR5dqnPBe6A2jYBAwTxHVPL8pV5yqY5Y9C9wcVJCvRlx+wTX6iMGQIQRHBEVYwIOkpwsXDhLiepgQlxIKTJzjHiXSZpTFKcMjdkTfIe26ORvRXntKo2Z0iqA3JaWNPdLElJcS1qfZJp4ZZ83+5j0xnvpuY/r7uVdIrMItsX/pppn/1elaFAY4NjUEVFNiGF0dy10y0xV9c/tLVYocEuI07lM8JcyMctpn22ikqV331jPxN5OpWb1neW6Gd31LGrD7c5yzoF2rugfVWvOwUj/JR13EDnaxT/M8Qh0XaKBlvB/xhGfrzBKWtLLPVKuQa7bxbVkPH/f7j1A=</latexit>

G<latexit sha1_base64="62MSal9IrrJAByTXj3PzAZZ4ydM=">AAACxHicjVHLSsNAFD2Nr1pfVZdugkVwVZIq6LIoqMsWbCvUIsl0WkMnDzIToRT9Abf6beIf6F94Z0xBLaITkpw5954zc+/1ExFI5TivBWtufmFxqbhcWlldW98ob261ZZyljLdYLOL0yvckF0HEWypQgl8lKfdCX/COPzrV8c4dT2UQR5dqnPBe6A2jYBAwTxHVPL8pV5yqY5Y9C9wcVJCvRlx+wTX6iMGQIQRHBEVYwIOkpwsXDhLiepgQlxIKTJzjHiXSZpTFKcMjdkTfIe26ORvRXntKo2Z0iqA3JaWNPdLElJcS1qfZJp4ZZ83+5j0xnvpuY/r7uVdIrMItsX/pppn/1elaFAY4NjUEVFNiGF0dy10y0xV9c/tLVYocEuI07lM8JcyMctpn22ikqV331jPxN5OpWb1neW6Gd31LGrD7c5yzoF2rugfVWvOwUj/JR13EDnaxT/M8Qh0XaKBlvB/xhGfrzBKWtLLPVKuQa7bxbVkPH/f7j1A=</latexit>

G<latexit sha1_base64="62MSal9IrrJAByTXj3PzAZZ4ydM=">AAACxHicjVHLSsNAFD2Nr1pfVZdugkVwVZIq6LIoqMsWbCvUIsl0WkMnDzIToRT9Abf6beIf6F94Z0xBLaITkpw5954zc+/1ExFI5TivBWtufmFxqbhcWlldW98ob261ZZyljLdYLOL0yvckF0HEWypQgl8lKfdCX/COPzrV8c4dT2UQR5dqnPBe6A2jYBAwTxHVPL8pV5yqY5Y9C9wcVJCvRlx+wTX6iMGQIQRHBEVYwIOkpwsXDhLiepgQlxIKTJzjHiXSZpTFKcMjdkTfIe26ORvRXntKo2Z0iqA3JaWNPdLElJcS1qfZJp4ZZ83+5j0xnvpuY/r7uVdIrMItsX/pppn/1elaFAY4NjUEVFNiGF0dy10y0xV9c/tLVYocEuI07lM8JcyMctpn22ikqV331jPxN5OpWb1neW6Gd31LGrD7c5yzoF2rugfVWvOwUj/JR13EDnaxT/M8Qh0XaKBlvB/xhGfrzBKWtLLPVKuQa7bxbVkPH/f7j1A=</latexit>

G<latexit sha1_base64="62MSal9IrrJAByTXj3PzAZZ4ydM=">AAACxHicjVHLSsNAFD2Nr1pfVZdugkVwVZIq6LIoqMsWbCvUIsl0WkMnDzIToRT9Abf6beIf6F94Z0xBLaITkpw5954zc+/1ExFI5TivBWtufmFxqbhcWlldW98ob261ZZyljLdYLOL0yvckF0HEWypQgl8lKfdCX/COPzrV8c4dT2UQR5dqnPBe6A2jYBAwTxHVPL8pV5yqY5Y9C9wcVJCvRlx+wTX6iMGQIQRHBEVYwIOkpwsXDhLiepgQlxIKTJzjHiXSZpTFKcMjdkTfIe26ORvRXntKo2Z0iqA3JaWNPdLElJcS1qfZJp4ZZ83+5j0xnvpuY/r7uVdIrMItsX/pppn/1elaFAY4NjUEVFNiGF0dy10y0xV9c/tLVYocEuI07lM8JcyMctpn22ikqV331jPxN5OpWb1neW6Gd31LGrD7c5yzoF2rugfVWvOwUj/JR13EDnaxT/M8Qh0XaKBlvB/xhGfrzBKWtLLPVKuQa7bxbVkPH/f7j1A=</latexit>

G(x, sy), sx<latexit sha1_base64="GgHqNdPmg9iDMjWnd9VZhN+8eio=">AAAB+XicbVDLSsNAFL2pr1pfUZduBotQQUoigi4LLnRZwT6gDWEynbRDJw9mJqUhBPwQNy4UceufuPNvnLRdaOuByz2ccy9z53gxZ1JZ1rdRWlvf2Nwqb1d2dvf2D8zDo7aMEkFoi0Q8El0PS8pZSFuKKU67saA48DjteOPbwu9MqJAsCh9VGlMnwMOQ+YxgpSXXNO9q0wvpZml+XrRp7ppVq27NgFaJvSBVWKDpml/9QUSSgIaKcCxlz7Zi5WRYKEY4zSv9RNIYkzEe0p6mIQ6odLLZ5Tk608oA+ZHQFSo0U39vZDiQMg08PRlgNZLLXiH+5/US5d84GQvjRNGQzB/yE45UhIoY0IAJShRPNcFEMH0rIiMsMFE6rIoOwV7+8ippX9Ztq24/XFUb7ad5HGU4gVOogQ3X0IB7aEILCEzgGV7hzciMF+Pd+JiPloxFhMfwB8bnDyMek8w=</latexit><latexit sha1_base64="GgHqNdPmg9iDMjWnd9VZhN+8eio=">AAAB+XicbVDLSsNAFL2pr1pfUZduBotQQUoigi4LLnRZwT6gDWEynbRDJw9mJqUhBPwQNy4UceufuPNvnLRdaOuByz2ccy9z53gxZ1JZ1rdRWlvf2Nwqb1d2dvf2D8zDo7aMEkFoi0Q8El0PS8pZSFuKKU67saA48DjteOPbwu9MqJAsCh9VGlMnwMOQ+YxgpSXXNO9q0wvpZml+XrRp7ppVq27NgFaJvSBVWKDpml/9QUSSgIaKcCxlz7Zi5WRYKEY4zSv9RNIYkzEe0p6mIQ6odLLZ5Tk608oA+ZHQFSo0U39vZDiQMg08PRlgNZLLXiH+5/US5d84GQvjRNGQzB/yE45UhIoY0IAJShRPNcFEMH0rIiMsMFE6rIoOwV7+8ippX9Ztq24/XFUb7ad5HGU4gVOogQ3X0IB7aEILCEzgGV7hzciMF+Pd+JiPloxFhMfwB8bnDyMek8w=</latexit><latexit sha1_base64="GgHqNdPmg9iDMjWnd9VZhN+8eio=">AAAB+XicbVDLSsNAFL2pr1pfUZduBotQQUoigi4LLnRZwT6gDWEynbRDJw9mJqUhBPwQNy4UceufuPNvnLRdaOuByz2ccy9z53gxZ1JZ1rdRWlvf2Nwqb1d2dvf2D8zDo7aMEkFoi0Q8El0PS8pZSFuKKU67saA48DjteOPbwu9MqJAsCh9VGlMnwMOQ+YxgpSXXNO9q0wvpZml+XrRp7ppVq27NgFaJvSBVWKDpml/9QUSSgIaKcCxlz7Zi5WRYKEY4zSv9RNIYkzEe0p6mIQ6odLLZ5Tk608oA+ZHQFSo0U39vZDiQMg08PRlgNZLLXiH+5/US5d84GQvjRNGQzB/yE45UhIoY0IAJShRPNcFEMH0rIiMsMFE6rIoOwV7+8ippX9Ztq24/XFUb7ad5HGU4gVOogQ3X0IB7aEILCEzgGV7hzciMF+Pd+JiPloxFhMfwB8bnDyMek8w=</latexit><latexit 
sha1_base64="GgHqNdPmg9iDMjWnd9VZhN+8eio=">AAAB+XicbVDLSsNAFL2pr1pfUZduBotQQUoigi4LLnRZwT6gDWEynbRDJw9mJqUhBPwQNy4UceufuPNvnLRdaOuByz2ccy9z53gxZ1JZ1rdRWlvf2Nwqb1d2dvf2D8zDo7aMEkFoi0Q8El0PS8pZSFuKKU67saA48DjteOPbwu9MqJAsCh9VGlMnwMOQ+YxgpSXXNO9q0wvpZml+XrRp7ppVq27NgFaJvSBVWKDpml/9QUSSgIaKcCxlz7Zi5WRYKEY4zSv9RNIYkzEe0p6mIQ6odLLZ5Tk608oA+ZHQFSo0U39vZDiQMg08PRlgNZLLXiH+5/US5d84GQvjRNGQzB/yE45UhIoY0IAJShRPNcFEMH0rIiMsMFE6rIoOwV7+8ippX9Ztq24/XFUb7ad5HGU4gVOogQ3X0IB7aEILCEzgGV7hzciMF+Pd+JiPloxFhMfwB8bnDyMek8w=</latexit>

D<latexit sha1_base64="Ua3ZUDwv3oP+zXDWjFsiLRMRAws=">AAACxHicjVHLSsNAFD2Nr1pfVZdugkVwVZIq6LKoiMsWbCvUIsl0WkMnDzIToRT9Abf6beIf6F94Z0xBLaITkpw5954zc+/1ExFI5TivBWtufmFxqbhcWlldW98ob261ZZyljLdYLOL0yvckF0HEWypQgl8lKfdCX/COPzrV8c4dT2UQR5dqnPBe6A2jYBAwTxHVPLspV5yqY5Y9C9wcVJCvRlx+wTX6iMGQIQRHBEVYwIOkpwsXDhLiepgQlxIKTJzjHiXSZpTFKcMjdkTfIe26ORvRXntKo2Z0iqA3JaWNPdLElJcS1qfZJp4ZZ83+5j0xnvpuY/r7uVdIrMItsX/pppn/1elaFAY4NjUEVFNiGF0dy10y0xV9c/tLVYocEuI07lM8JcyMctpn22ikqV331jPxN5OpWb1neW6Gd31LGrD7c5yzoF2rugfVWvOwUj/JR13EDnaxT/M8Qh0XaKBlvB/xhGfr3BKWtLLPVKuQa7bxbVkPH/Dbj00=</latexit>

VGG-19

Figure 5: The detailed ReshapeGAN model for reshaping by cross-domain guidance with unpaired data.

the whole task into two sub-tasks: domain-preserved reshaping and refining. The detailed framework is shown in Fig. 5.

For the first stage, we add a domain classification loss to the overall loss function $\mathcal{L}^{\mathrm{unpaired}}_{\mathrm{ReshapeGAN}}(G, D)$ to ensure proper domain preservation during the reshaping. In greater detail, the overall loss becomes

$$
\begin{aligned}
\mathcal{L}^{\text{cross-domain 1}}_{\mathrm{ReshapeGAN}}(G, D, D_d) &= \mathcal{L}^{\mathrm{unpaired}}_{\mathrm{ReshapeGAN}}(G, D) + \lambda \mathcal{L}_{cls}(D_d) \\
&= \mathcal{L}_{sadv}(G, D) + \gamma \mathcal{L}_{perturb}(D) + \delta \mathcal{L}^{\mathrm{unpaired}}_{pixel}(G) + \sigma \mathcal{L}^{\mathrm{unpaired}}_{percep}(G) + \lambda \mathcal{L}_{cls}(D_d),
\end{aligned} \quad (10)
$$

where $D_d$ is a domain classifier and $\mathcal{L}_{cls}(D_d)$ is the domain classification loss, with $\lambda$ as its weight.

Domain classification loss $\mathcal{L}_{cls}(D_d)$. The domain classifier $D_d$ is expected to help ensure domain preservation, leading to the effect of appearance preservation, i.e., maintaining the appearance and identity of the object during the reshaping. More specifically, $\mathcal{L}_{cls}(D_d)$ is defined as

$$
\mathcal{L}_{cls}(D_d) = \mathbb{E}_x \left[ -\log D_d(x, d_x) \right] + \mathbb{E}_y \left[ -\log D_d(y, d_y) \right] + \mathbb{E}_{x,y} \left[ -\log D_d(G(x, s_y), d_x) \right], \quad (11)
$$

where $d_x$ and $d_y$ denote the domain labels of $x \in X$ and $y \in Y$, respectively. Here we derive this loss from StarGAN [10], and we use one-hot encoding to capture the domain distribution. Our strategy is similar to that of MUNIT [23], which tries to learn a disentangled representation to obtain more expressive and representative style information. Following CRN [7], we use channel-wise concatenation to integrate the image and domain label (e.g., $x$ and $d_x$) for the classifier $D_d$.
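The channel-wise concatenation of an image and a one-hot domain label can be sketched as below. This is a minimal NumPy illustration under our own assumptions about layout (channels-first) and naming, not the authors' code: each label entry is tiled into a constant spatial plane and stacked onto the image along the channel axis.

```python
import numpy as np

def concat_domain_label(image, domain_label):
    """Tile a one-hot domain label into constant spatial planes and
    stack them onto the image along the channel axis (C, H, W)."""
    c, h, w = image.shape
    planes = np.broadcast_to(domain_label[:, None, None],
                             (domain_label.shape[0], h, w))
    return np.concatenate([image, planes], axis=0)

x = np.random.rand(3, 8, 8).astype(np.float32)     # image x
d_x = np.array([0.0, 1.0, 0.0], dtype=np.float32)  # one-hot label d_x
out = concat_domain_label(x, d_x)
print(out.shape)  # (6, 8, 8)
```

The classifier then sees the label as extra constant input channels, which is the same integration scheme used for the geometric guidance.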

In the second stage, we apply the object reshaping again by making use of $\mathcal{L}^{\mathrm{unpaired}}_{\mathrm{ReshapeGAN}}(G, D)$, so that a second round of reshaping (which should be significantly smaller in magnitude than that of the first stage) can be performed. We take the output of the first stage as the input of this stage. The overall loss for this stage is just

$$
\begin{aligned}
\mathcal{L}^{\text{cross-domain 2}}_{\mathrm{ReshapeGAN}}(G, D, D_d) &= \mathcal{L}^{\mathrm{unpaired}}_{\mathrm{ReshapeGAN}}(G, D) \\
&= \mathcal{L}_{sadv}(G, D) + \gamma \mathcal{L}_{perturb}(D) + \delta \mathcal{L}^{\mathrm{unpaired}}_{pixel}(G) + \sigma \mathcal{L}^{\mathrm{unpaired}}_{percep}(G).
\end{aligned} \quad (12)
$$

The reason for splitting a difficult cross-domain object reshaping task into two easier sub-tasks is that it yields better performance than a one-stage generation (using only the first stage). In the one-stage generation, two types of information (domain/style information $d$ and shape/geometric information $s$) are used as constraints simultaneously, making it hard to ensure high-quality reshaping. An ablation study verifying this is given in the experiment section below.



3.4 Implementation details

Our backbone network derives from StarGAN [10], but unlike [10], we concatenate the geometric guidance and the raw input to form a new input for both the generator and the discriminator. Besides, in order to capture the domain label effectively, we devise a new architecture for the generator. In our experiments, we find that the model performs well when we choose γ = η = 1 and δ = 10. Our final loss is the sum of the losses described in Eqs. 7, 10 and 12. We apply this final loss to the generator with the Adam optimizer at a learning rate of 0.0002. Our code will be made available at https://github.com/zhengziqiang/ReshapeGAN. All implementation details can be found in our code.
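As a rough sketch of how the weighted terms of Eq. 10 combine, consider the function below. The function name is ours; γ and δ follow the paper (γ = 1, δ = 10), but the σ and λ defaults are placeholder assumptions, since the paper does not report them.

```python
def total_generator_loss(l_sadv, l_perturb, l_pixel, l_percep, l_cls,
                         gamma=1.0, delta=10.0, sigma=1.0, lam=1.0):
    """Weighted sum of the ReshapeGAN loss terms (cf. Eq. 10).

    gamma and delta follow the paper (gamma = 1, delta = 10);
    sigma and lam are placeholder values, not reported weights.
    """
    return (l_sadv + gamma * l_perturb + delta * l_pixel
            + sigma * l_percep + lam * l_cls)

print(total_generator_loss(1.0, 1.0, 1.0, 1.0, 1.0))  # 14.0
```

The scalar returned here would be back-propagated through the generator by the Adam optimizer at the stated learning rate of 0.0002.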

4 Experiments and results

4.1 Evaluation metrics

Learned Perceptual Image Patch Similarity (LPIPS) was first proposed by [75]; it computes the perceptual similarity (actually in terms of distance) between two image patches. Lower LPIPS means the two image patches have higher perceptual similarity. Considering two image domains, we can compute the LPIPS metric (averaged over sampled patch pairs) to evaluate the perceptual similarity between them.

Frechet Inception Distance (FID) computes the similarity between the generated sample distribution and the real data distribution. This metric is consistent and robust for evaluating the quality of generated images [39, 4], and it can be calculated by:

$$
\mathrm{FID} = \| \mu_x - \mu_g \|_2^2 + \mathrm{Tr}\left( \Sigma_x + \Sigma_g - 2 \left( \Sigma_x \Sigma_g \right)^{\frac{1}{2}} \right), \quad (13)
$$

where $(\mu_x, \Sigma_x)$ and $(\mu_g, \Sigma_g)$ are the pairs of mean and covariance of the sample embeddings from the real data distribution and the generated data distribution, respectively. Lower FID means a smaller difference between the distributions of the generated and the target images, and therefore higher quality of the generated images.

Geometrical Consistency is required for our object reshaping tasks to evaluate the matching level geometrically. In order to know whether the model can synthesize
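Eq. 13 can be computed from fitted Gaussian statistics as in the following sketch (plain NumPy; function names are ours). It uses the identity $\mathrm{Tr}((\Sigma_x \Sigma_g)^{1/2}) = \mathrm{Tr}((\Sigma_x^{1/2} \Sigma_g \Sigma_x^{1/2})^{1/2})$ so the matrix square root is always taken of a symmetric matrix.

```python
import numpy as np

def sqrtm_sym(a):
    """Square root of a symmetric positive semi-definite matrix
    via eigendecomposition (eigenvalues clipped at zero)."""
    w, v = np.linalg.eigh(a)
    w = np.clip(w, 0.0, None)
    return (v * np.sqrt(w)) @ v.T

def fid(mu_x, sigma_x, mu_g, sigma_g):
    """Frechet Inception Distance (Eq. 13) between Gaussians fitted
    to real (x) and generated (g) feature embeddings."""
    diff = mu_x - mu_g
    sx_half = sqrtm_sym(sigma_x)
    # Tr((Sx Sg)^{1/2}) = Tr((Sx^{1/2} Sg Sx^{1/2})^{1/2}), and the
    # inner matrix is symmetric, so eigh-based sqrtm applies.
    covmean = sqrtm_sym(sx_half @ sigma_g @ sx_half)
    return float(diff @ diff + np.trace(sigma_x + sigma_g - 2.0 * covmean))
```

Identical distributions give an FID of zero; in practice the statistics are estimated from Inception-network embeddings of real and generated image sets.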

images with the desired geometric information, we use an available state-of-the-art geometry estimation model to extract the geometric information (such as landmarks and poses) from both the synthesized images and the target images, and compare the results from sample pairs in the shape space. For facial expression generation, we use dlib [26] to extract landmarks and measure the similarity between two images by computing SSIM (see Traditional Evaluation Metrics below for details) and LPIPS scores using the landmark information, marked as SSIM (Landmark) and LPIPS (Landmark) respectively.

Identification Distance is important for our object reshaping tasks to evaluate whether the appearance and identity information are preserved while the objects are reshaped. We adopt object instance recognition (i.e., identification) algorithms to evaluate the similarity or distance between generated images and input images. In the case of faces, we use an effective open-source face recognition algorithm1 for this purpose. A significantly large distance according to the classifier, i.e., more than the normal cutoff of 0.6, indicates that the two faces are from two different identities. That is, if the distance between two faces is lower than 0.6, we can consider the two faces to have the same identity. Moreover, theoretically, for distances below 0.6, a larger distance might indicate better reshaping with preserved identity, while a very small distance may mean that the model fails to do any reshaping, so that the generated image is almost identical to the input image. This can also be verified by computing the identification distances on the RaFD [31] dataset, where each identity has 8 different emotional expressions captured from 5 different angles/viewpoints (0°, ±45° and ±90°): the average distance between the neutral faces and the other emotional expressions at the 0° viewpoint over all identities is about 0.3, while the average distance between the neutral faces at 0° and the other emotional expressions at the ±45° viewpoints is about 0.5, and the average distance between the neutral faces at 0° and the other emotional expressions at the ±90° viewpoints is about 0.6.

User Study (Identity / Shape) is the human evaluation applicable to all the generated images. We also use this golden standard for our object reshaping tasks. Here we ask 20 volunteers (users) to give a True/False judgement

1https://github.com/ageitgey/face_recognition
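The 0.6 cutoff described above reduces to thresholding a Euclidean distance between face embeddings. The sketch below (plain NumPy; function names are ours) mirrors the behavior of the distance helper in the referenced library, without depending on it:

```python
import numpy as np

def face_distance(encodings, probe):
    """Euclidean distance from each known face encoding to the probe,
    mirroring what typical face-embedding pipelines compute."""
    return np.linalg.norm(np.asarray(encodings) - probe, axis=1)

def same_identity(enc_a, enc_b, cutoff=0.6):
    """Two faces are judged to share an identity when their
    embedding distance falls below the 0.6 cutoff."""
    return float(np.linalg.norm(enc_a - enc_b)) < cutoff
```

With real images, the encodings would come from a face-embedding network; here any 128-dimensional vectors illustrate the thresholding logic.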



on the output images. Specifically, we give each user an input image, a reference image and a synthesized image, and ask them to judge as true or false whether the synthesized image keeps the identity information of the input (Identity) and whether the synthesized image has the same shape expression as the reference image (Shape). The users are given unlimited time to make the decision. For each comparison, we randomly generate 100 images, and each image is judged by at least 2 different users. A higher identity score alone or a higher shape score alone cannot indicate confident performance for our object reshaping tasks, since these tasks always require both identity preservation and reshaping ability. So higher votes for both same identity and consistent shape indicate better reshaping performance.

Traditional Evaluation Metrics. For the cases where paired data exists, we can use some traditional image quality assessment metrics for performance evaluation, including MSE, RMSE, PSNR and SSIM. MSE (Mean Square Error) and RMSE (Root Mean Square Error) [67] compute pixel-wise errors between synthesized images and real images; lower MSE and RMSE represent higher image generation quality. PSNR (Peak Signal-to-Noise Ratio) [21] can roughly evaluate the image quality independently, and usually the higher the PSNR, the better. SSIM (Structural Similarity Index Measure) [21] measures the similarity between two images, and higher SSIM denotes higher structural similarity between the generated images and the real images.
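For images normalized to [0, 1], MSE, RMSE, and PSNR can be computed as in this minimal sketch (not tied to any particular evaluation package):

```python
import numpy as np

def mse(a, b):
    """Mean squared pixel error between two images."""
    return float(np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2))

def rmse(a, b):
    """Root mean squared pixel error."""
    return mse(a, b) ** 0.5

def psnr(a, b, max_val=1.0):
    """Peak signal-to-noise ratio in dB for images in [0, max_val]."""
    return float(10.0 * np.log10(max_val ** 2 / mse(a, b)))
```

For example, a constant-0.5 image compared against a black image gives an MSE of 0.25, an RMSE of 0.5, and a PSNR of 10·log10(4) ≈ 6.02 dB.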

4.2 Reshaping by within-domain guidance with paired data

4.2.1 Facial expression generation

Table 1: Quantitative comparison of facial expression generation on the KDEF dataset. The symbol ↑ (↓) indicates that the larger (smaller) the value, the better the performance.

Method        SSIM↑   PSNR↑    MSE↓    RMSE↓   LPIPS↓  FID↓
Pix2pix [24]  0.8673  18.0138  0.0160  0.1133  0.2603  140.7901
PG2 [42]      0.9118  19.2469  0.0122  0.0946  0.2032  97.2435
ReshapeGAN    0.9227  19.7374  0.0123  0.0949  0.1808  84.1437

Table 2: Quantitative comparison of facial expression generation on the RaFD dataset. The symbol ↑ (↓) indicates that the larger (smaller) the value, the better the performance.

Method        SSIM↑   PSNR↑    MSE↓    RMSE↓   LPIPS↓  FID↓
Pix2pix [24]  0.9565  19.0794  0.0130  0.1121  0.1379  105.0758
PG2 [42]      0.9631  19.6109  0.0115  0.1055  0.1297  102.6674
ReshapeGAN    0.9641  20.1008  0.0103  0.0998  0.1030  49.2096

Figure 6: Visual comparison of facial expression generation on the KDEF dataset (rows: Pix2pix, PG2, ReshapeGAN, Target; columns: Input, Surprised, Sad, Happy, Afraid, Disgusted, Angry).

Figure 7: Visual comparison of facial expression generation on the RaFD dataset (rows: Pix2pix, PG2, ReshapeGAN, Target; columns: Input, Surprised, Sad, Happy, Fearful, Disgusted, Contemptuous, Angry).

First, to evaluate the image generation performance for reshaping, we conduct experiments on facial expression



datasets, namely KDEF [5] and RaFD [31]. For these two facial datasets, we use paired data to train all the models. KDEF contains 70 different identities with 7 different emotional expressions and 5 different poses, while RaFD contains 67 identities with 8 different emotional expressions, 5 different poses and 3 eye gaze directions. For both emotional datasets, we only use the frontal images with neutral faces as input images and generate images with the other facial expressions. We therefore reorganize the two datasets, yielding KDEF (420 images in total: 336 for training and 84 for evaluation) and RaFD (469 in total: 392 for training and 77 for evaluation) for facial expression reshaping.

Here we compare our ReshapeGAN with two state-of-the-art supervised methods, i.e., Pix2pix [24] and PG2 [42], and the quantitative comparison results can be found in Table 1 and Table 2, in terms of SSIM, PSNR, MSE, RMSE, LPIPS and FID, where our ReshapeGAN gets the best performance among the three methods. For ReshapeGAN and PG2, we use dlib [26] to obtain the geometric information as guidance. For Pix2pix, to ensure a fair comparison, we encode the emotional or viewpoint expression as a one-hot code and inject it into the bottleneck of the generator as guidance. More visual results can be found in Fig. 6 and Fig. 7. As can be seen, Pix2pix cannot generate acceptable results, while the outputs of PG2 have blurry boundaries and some dirty color blocks. Our ReshapeGAN generates reasonably clear outputs that preserve the identity information of the inputs.

To investigate the effectiveness of the different components of our approach, we design additional experiments on RaFD for an ablation study; the quantitative results are listed in Table 3. It can be seen that the perturbed loss Lperturb improves the PSNR score dramatically, and the geometric information helps to reduce the LPIPS distance, while the perceptual loss Lpercep reduces both LPIPS and FID, since the perceptual loss actually provides multi-scale constraints to the generator by using a cascade architecture.

4.2.2 Viewpoint transfer

Then, we devise a viewpoint transfer task, aiming to generate faces with different poses (viewpoints) on the FEI dataset [61], which contains 200 identities with 11 different pose directions. Since we cannot extract accurate geometric in-

Table 3: Quantitative comparison for the ablation study of our ReshapeGAN on facial expression generation on the RaFD dataset. The symbol ↑ (↓) indicates that the larger (smaller) the value, the better the performance.

Method             SSIM↑   PSNR↑    MSE↓    RMSE↓   LPIPS↓  FID↓
Backbone           0.9632  19.8962  0.0106  0.1017  0.1365  106.0801
Geometry           0.9655  19.7940  0.0109  0.1030  0.1242  89.5522
Geometry+Lperturb  0.9636  20.1368  0.0101  0.0991  0.1241  81.6016
Geometry+Lpercep   0.9628  19.5127  0.0116  0.1064  0.1184  61.8619
ReshapeGAN         0.9641  20.1008  0.0103  0.0998  0.1030  49.2096

Table 4: Quantitative comparison of viewpoint transfer (facial pose translation) on the FEI dataset. The symbol ↑ (↓) indicates that the larger (smaller) the value, the better the performance.

Method        SSIM↑   PSNR↑    MSE↓     RMSE↓   LPIPS↓  FID↓
Pix2pix [24]  0.9274  16.9358  0.01619  0.1228  0.2236  79.3540
DR-GAN [62]   0.9384  17.4995  0.01385  0.1138  0.2001  83.5483
PG2 [42]      0.9354  17.3366  0.01447  0.1160  0.1887  70.7044
ReshapeGAN    0.9578  19.2299  0.00970  0.0941  0.1721  73.5864

Figure 8: Visual comparison of viewpoint transfer (facial pose translation) on the FEI dataset (rows: Pix2pix, DR-GAN, PG2, ReshapeGAN, Target; columns: input followed by pose outputs).

formation from the full left or full right images, we don’tuse them for training. We use frontal images as inputs to



generate the other 8 pose images. We compare our ReshapeGAN with Pix2pix [24], PG2 [42], and DR-GAN [62]. For ReshapeGAN, Pix2pix, and PG2, we use the same settings as facial expression generation (Section 4.2.1) for adding guidance to these three methods; for DR-GAN, we use the authors' default setting for facial pose generation but with our training data. Besides, we use 160 identities for training and the other 40 for testing. The quantitative comparison results are listed in Table 4. Our ReshapeGAN gets the lowest LPIPS and FID values among the four methods. Visual synthesized results are shown in Fig. 8 for comparison. Pix2pix generates images with blurry boundaries and some dirty color blocks, and DR-GAN and PG2 distort the faces, while our ReshapeGAN synthesizes better results with detailed parts (such as eyes, mouth, and nose) as well as reasonable faces. By combining the geometric information, our ReshapeGAN can easily capture the relationships between different parts.

Note that the geometric information is not limited to landmarks, instance maps, and pose skeletons. Any other accurate geometric information can also be extracted and used as additional guidance to obtain better results.

4.3 Reshaping by within-domain guidance with unpaired data

4.3.1 Controllable face translation on CelebA

For this task, we aim to achieve controllable facial reshaping using unpaired data. We conduct the experiments on the CelebA dataset [38], which contains approximately 200 thousand images with high diversity. To get precise facial expressions, we reorganize this dataset into a sub-dataset by using dlib [26] for face detection, cropping the detected face regions, and resizing them to 256 × 256. We use 157,619 images for training and 39,404 images for testing. In theory, our method could generate 39,404 × 39,404 images at the testing stage. Considering the huge consumption of computing resources, we only generate 1,000 images with different geometric representations for one input sample.
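The crop-then-resize preprocessing described above can be sketched with a small coordinate helper. This is a minimal illustration, not dlib's API: `square_crop` is a hypothetical helper that expands a detected face box to a square and clamps it to the image bounds, after which the square region would be resized to 256 × 256.

```python
def square_crop(box, img_w, img_h):
    """Expand a detected face box (left, top, right, bottom) to a square
    crop clamped to the image bounds. Hypothetical helper, not dlib API."""
    left, top, right, bottom = box
    w, h = right - left, bottom - top
    side = max(w, h)
    # center the square on the detected box
    cx, cy = left + w // 2, top + h // 2
    l = max(0, cx - side // 2)
    t = max(0, cy - side // 2)
    # clamp so the square stays inside the image
    l = min(l, img_w - side)
    t = min(t, img_h - side)
    return (l, t, l + side, t + side)

crop = square_crop((100, 120, 200, 240), img_w=640, img_h=480)
# the resulting crop is square, ready for resizing to 256 x 256
assert crop[2] - crop[0] == crop[3] - crop[1]
```

A square crop keeps the aspect ratio intact when resizing, so faces are not distorted before training.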

Fig. 9 shows that our ReshapeGAN can synthesize 25 plausible images from 5 different input images, where each row has appearance and content consistency while each column has geometric consistency. It can also be seen that ReshapeGAN can generate images with the required geometric representation given a geometric guidance, even though there are no paired samples for training, indicating that ReshapeGAN learns facial expressions from the entire dataset rather than from specific samples. Moreover, Fig. 10 exhibits 96 randomly generated images using different geometric guidance. Note that the smaller image framed by a red box in the lower right corner of every generated image is the given reference image (providing the geometric information). The synthesized images preserve the appearance and identity of the input images, and our ReshapeGAN is able to generate corresponding outputs with the emotional and layout/pose information of the given reference images.


Figure 9: The 5 × 5 outputs by our ReshapeGAN using 5 random images as both inputs and references on the CelebA dataset. Images framed by dotted lines are input images and reference images; each row shows images with identical appearance (e.g., the row framed by the black line) while each column exhibits images with the same geometric representation (e.g., the column framed by the red line).

To our knowledge, we are the first to achieve arbitrary identity reshaping using unpaired data, with only limited images from each identity. Some traditional methods can perform face swapping, while Deepfakes2 only translates one identity to another identity using abundant

2https://github.com/deepfakes/Faceswap



Figure 10: 96 randomly synthesized images by our ReshapeGAN from one input sample on the CelebA dataset. The middle enlarged image shows the source input image. The smaller image framed by a red box in the lower right corner of every generated image is the given reference image. The synthesized images have the same emotional and pose information as the given reference images while preserving the appearance and identity information of the input images. Please zoom in to see more details.

images from both the source identity and the target identity, and it fails to translate between multiple identities. Thus, we choose Faceswap from OpenCV3 as the traditional method for comparison. Besides, we also make comparisons using some popular image-to-image translation methods. Due to the lack of paired data, we conduct experiments using Pix2pix and PG2 with the pixel-level loss between outputs and ground truth removed (denoted as Pix2pix-- and PG2--, respectively). Here we regard the geometric information as conditional information and provide it to the generator by concatenating the raw input and the geometry; in this way, we conduct experiments using StarGAN [10], and considering

3https://opencv.org

that there is only one domain in the CelebA dataset, we remove the classification loss of StarGAN (denoted as StarGAN--).
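The conditioning scheme described above, concatenating the geometry with the raw input along the channel axis, can be sketched as follows. This is an illustration with nested lists standing in for H × W × C tensors; a real implementation would concatenate framework tensors instead.

```python
def concat_geometry(rgb, geom):
    """Concatenate a one-channel geometry map as an extra channel of an
    RGB image (H x W x 3 -> H x W x 4), the conditioning scheme described
    in the text. Sketch with nested lists in place of tensors."""
    assert len(rgb) == len(geom) and len(rgb[0]) == len(geom[0])
    return [[px + [g] for px, g in zip(rgb_row, g_row)]
            for rgb_row, g_row in zip(rgb, geom)]

rgb = [[[0.1, 0.2, 0.3] for _ in range(4)] for _ in range(4)]  # 4x4 RGB image
geom = [[1.0] * 4 for _ in range(4)]                           # 4x4 landmark map
x = concat_geometry(rgb, geom)
assert len(x) == 4 and len(x[0]) == 4 and len(x[0][0]) == 4    # H x W x 4
```

The generator then sees the geometry as an ordinary input channel, so no architectural change is needed to condition it.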

The quantitative comparison is given in Table 5. SSIM and LPIPS scores computed on landmarks show the geometric consistency: higher SSIM and lower LPIPS scores indicate better geometric consistency with the reference images. FID computes the distance between the generated sample distribution and the real reference images, and a lower FID score represents better image generation quality. Our ReshapeGAN performs best in terms of SSIM, LPIPS, and FID. We also use the average identification distance to evaluate identity similarity: Faceswap fails to preserve the identity, with a distance larger than 0.6, while the other methods keep the identity information; among them, our ReshapeGAN reshapes more strongly, with the largest distance still below 0.6. The user study of identity/shape judgement arrives at the same conclusion: our ReshapeGAN works well considering both identity preservation and object reshaping.
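The identification-distance convention used in the tables (a distance above 0.6 means the identity is lost; below 0.6, a larger distance means stronger reshaping) can be made explicit with a small helper. A sketch; only the 0.6 threshold comes from the text, the helper itself is illustrative.

```python
IDENT_THRESHOLD = 0.6  # from the text: > 0.6 indicates a different identity

def interpret_identification_distance(d):
    """Map an identification distance to the reading used in the tables."""
    if d > IDENT_THRESHOLD:
        return "identity lost"
    # identity preserved; a larger distance (still below 0.6) = stronger reshaping
    return "identity preserved, reshaping strength %.4f" % (d / IDENT_THRESHOLD)

assert interpret_identification_distance(0.7230) == "identity lost"      # Faceswap-like
assert "preserved" in interpret_identification_distance(0.5700)          # ReshapeGAN-like
```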

A visual comparison of the different methods can be found in Fig. 11. As shown, Faceswap only borrows the face appearance from the target reference and merges it with the corresponding region without considering the relationships between parts, so the output images seem implausible and do not preserve the identity information of the inputs. Pix2pix-- and PG2-- fail to reshape the input images and generate images with artifacts, and StarGAN-- also fails to synthesize images with the required shape. Compared to these methods, our ReshapeGAN achieves face reshaping by providing one single reference image. The visual comparison in Fig. 11 provides the same clues as the quantitative comparison shown in Table 5. To show the powerful performance of our ReshapeGAN, we exhibit more synthesized results in Fig. 12. We organize these generated results sorted by progressive increase of the identification distance to the source input image in Fig. 12(a), where the right image shows the visualization of the identification distance, illustrating the relationship between the distance and identity preservation under object reshaping.

Based on the above synthesized results, we see that not all of the generated images borrow the effective information reasonably: some generated images have extremely exaggerated regions. More interestingly, if the geometric information of the given reference image is from



Table 5: Quantitative comparison of controllable face translation on the CelebA dataset. Higher SSIM and lower LPIPS scores represent better geometric matching with the guided geometric information. A lower FID score means better image quality. An identification distance larger than 0.6 indicates different identities, while a larger distance below 0.6 indicates better reshaping. Higher votes for both identity and shape in the user study indicate better reshaping performance. Please refer to Section 4.1 for details about the evaluation metrics.

Method SSIM (Landmark) LPIPS (Landmark) FID Identification Distance Identity / Shape (User Study)

Faceswap 0.6949 0.1775 111.7921 0.7230 0.025 / 0.605
Pix2pix-- 0.6049 0.2561 338.4090 0.4419 0.771 / 0.093
PG2-- 0.6605 0.2097 194.1435 0.3369 1.0 / 0.102
StarGAN-- 0.6544 0.2156 157.4540 0.2576 1.0 / 0.075

ReshapeGAN 0.8332 0.0932 110.7313 0.5700 0.385 / 0.719


Figure 11: Visual comparison of controllable face translation using different methods on the CelebA dataset.

a man while the input image depicts a woman, there is a chance that some sort of masculine pattern will be generated. We conjecture that the provided geometric information may somehow carry gender and age characteristics. If the pose distance between the input image and the given reference image is too large, our model fails to generate plausible outputs. To obtain more intuitive results, we compute the LPIPS distance between each reference image and the input image and visualize the results in Fig. 12(b), where the generated images are sorted by progressive increase of the LPIPS distance between the target reference image and the source input image. As can be seen, ReshapeGAN performs well when the LPIPS distance is below an implicit threshold. However, when the LPIPS distance is too large, ReshapeGAN is not able to translate the input image into the given target geometric space with such limited prior information.

4.3.2 Controllable face translation on UTKFace

Then, to evaluate whether the geometric information carries underlying information such as gender and age characteristics, we conduct face translation experiments on another large dataset, UTKFace [77]. This dataset contains over 20,000 face images with a long age span (ranging from 0 to 116 years old). In this task, we again generate images with arbitrary geometric information. According to our assumption above, the geometric information may contain underlying scale, position, and facial expression information. By providing this information to the generator and discriminator, they can build a mapping function between input images and geometric information. Visual results are shown in Fig. 13: the generated images cover a large range of ages. We also visualize the identification distance, from which we can see that the given geometric information does cover intrinsic characteristics. In order to fool the discriminator, the generator has to dig out this implicit information and express it reasonably.

For this task, we also conduct some comparative experiments using the aforementioned methods. The comparative visual results are shown in Fig. 14. We can see that the



Table 6: Quantitative comparison of controllable face translation on the UTKFace dataset. Higher SSIM and lower LPIPS scores represent better geometric matching with the guided geometric information. A lower FID score means higher image quality. An identification distance larger than 0.6 indicates different identities, while a larger distance below 0.6 indicates better reshaping. Higher votes for both identity and shape in the user study indicate better reshaping performance. Please refer to Section 4.1 for details about the evaluation metrics.

Method SSIM (Landmark) LPIPS (Landmark) FID Identification Distance Identity / Shape (User Study)

Faceswap 0.5595 0.2404 120.09001 0.7354 0.045 / 0.807
Pix2pix-- 0.6063 0.2197 251.4699 0.4144 0.786 / 0.188
PG2-- 0.5197 0.3127 173.5819 0.2628 1.0 / 0.024
StarGAN-- 0.5501 0.2793 186.4931 0.1850 0.991 / 0.048

ReshapeGAN 0.8103 0.0849 103.8411 0.5902 0.494 / 0.850

the synthesized images of Pix2pix--, PG2--, and StarGAN-- do not achieve facial reshaping. Compared with Faceswap, our ReshapeGAN preserves the identity information better. The quantitative comparison of the different methods is given in Table 6. As can be seen, our ReshapeGAN performs best, with the highest SSIM and the lowest LPIPS and FID, indicating the best geometric matching with the highest image quality. Besides, ReshapeGAN also gets the largest identification distance below 0.6 among all methods. Meanwhile, the reshaping ability with identity preservation of ReshapeGAN is also validated by the user study.

4.3.3 Controllable cat reshaping

Moreover, we apply our ReshapeGAN to an interesting application: reshaping a cat by providing geometric information. We use data and annotations from the cat head dataset [76], where each annotation file describes the coordinates of 9 defined points (6 for the ears, 2 for the eyes, and 1 for the mouth). Here we use 7,287 cropped images (cropped according to the coordinates) for training and 1,000 images for testing. Experimental results are shown in Fig. 15: our ReshapeGAN can reshape the cat image given geometric information while keeping the detailed information of the input image. For this task, we only provide the limited 9-point annotation, which describes the locations of the ears, eyes, and mouth; even without richer guidance, our method still works well. For this task, we only compute the FID between the generated cat images and real images for reference, and the FID score is 76.7446.
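Cropping according to the annotated coordinates can be sketched as taking the bounding box of the 9 keypoints with some margin. A minimal sketch; the margin ratio is an illustrative assumption, not a value from the paper.

```python
def keypoint_crop(points, margin=0.25):
    """Bounding box around cat-head keypoints (6 ears, 2 eyes, 1 mouth),
    expanded by a margin ratio. The margin value is illustrative."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    mx, my = w * margin, h * margin
    return (min(xs) - mx, min(ys) - my, max(xs) + mx, max(ys) + my)

# 9 toy keypoints in the annotation's ear/eye/mouth layout
pts = [(10, 0), (20, 0), (30, 0), (40, 0), (50, 0), (60, 0),
       (25, 20), (45, 20), (35, 40)]
box = keypoint_crop(pts)
assert box == (-2.5, -10.0, 72.5, 50.0)
```

In practice the box would additionally be clamped to the image bounds before cropping.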

4.3.4 Controllable pose reshaping

In addition, the skeleton of the human body provides more detailed pose information. Thus, we use the pose estimation model OpenPose [6] to extract pose information from human images and regard it as the geometric guidance for the pose reshaping application. Here we apply our ReshapeGAN to the Panama dataset4. Note that we do not use paired training for this task and only add the cycle-consistency constraints to our model. By fusing pose information with appearance information from the given reference images, our method can generate reasonable results, as shown in Fig. 16. Here we only generate images frame by frame; we leave pose sequence generation with spatial and temporal constraints as future work. For this task, we also only compute the FID between the generated pose images and real images for reference, and the FID score is 145.3365.
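One simple way to turn extracted pose keypoints into a geometric guidance channel is to rasterize them into a binary map. A sketch under stated assumptions: the map resolution and point rendering are illustrative, and OpenPose itself outputs keypoint coordinates with confidence scores rather than such a map.

```python
def keypoints_to_map(keypoints, h, w):
    """Rasterize (x, y) pose keypoints into an h x w binary guidance map,
    one way to feed skeleton geometry to the generator (illustrative)."""
    m = [[0] * w for _ in range(h)]
    for x, y in keypoints:
        if 0 <= x < w and 0 <= y < h:  # skip points outside the frame
            m[y][x] = 1
    return m

pose = [(2, 1), (3, 3), (5, 5)]          # toy skeleton keypoints
guide = keypoints_to_map(pose, h=8, w=8)
assert sum(sum(row) for row in guide) == 3
assert guide[1][2] == 1                  # (x=2, y=1)
```

A production pipeline would typically draw limbs as line segments or Gaussian heatmaps instead of single pixels, but the conditioning idea is the same.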

4.3.5 Ablation study

On the one hand, to investigate the architecture of our geometry-guided method, we compare three cases for the generator and discriminator without or with geometric information: 1) only add geometric information to the generator, 2) only add geometric information to the discriminator, 3) add geometric information to both the generator and the discriminator. We conduct this ablation study on the CelebA dataset for the three cases, and the visual results are shown in Fig. 17(a). A discriminator without geometric

4https://github.com/llSourcell/Everybody_Dance_Now



Table 7: Quantitative comparison of different cases for our ReshapeGAN without or with geometric information on the CelebA dataset. Only adding geometric information to the generator G (Only G) or the discriminator D (Only D) yields a very low identification distance, showing that these variants fail to achieve geometric reshaping under the given geometric constraint; that is, they fall into the trivial solution of generating an output very similar to the input image (very low identification distance). The user study validates the same point (higher votes for both identity and shape indicate better reshaping performance). Our ReshapeGAN, by adding geometric information to both G and D, gets higher SSIM and lower LPIPS scores, indicating that it reshapes the input image with the given geometric information better, and also gets a lower FID score, representing higher image quality.

Method SSIM (Landmark) LPIPS (Landmark) FID Identification Distance Identity / Shape (User Study)

Only G 0.6763 0.1855 168.2893 0.0438 1.0 / 0.1739
Only D 0.6543 0.1924 176.5256 0.0352 1.0 / 0.073

G+D (ReshapeGAN) 0.8332 0.0932 110.7313 0.5700 0.385 / 0.719

Table 8: Quantitative comparison of different cases for our ReshapeGAN without or with the perceptual loss on the CelebA dataset. Although the model without the perceptual loss gets better performance in terms of FID, LPIPS, and SSIM, its identification distance is larger than 0.6. Thus the perceptual loss does help to preserve the identity and appearance information of the source input image.

Method SSIM (Landmark) LPIPS (Landmark) FID Identification Distance Identity / Shape (User Study)

w/o L_percep 0.8683 0.0706 84.6713 0.6075 0.145 / 0.879
w/ L_percep^gr 0.8509 0.0774 99.9562 0.6188 0.186 / 0.934
w/ L_percep^gi 0.8332 0.0932 110.7313 0.5700 0.385 / 0.719

Table 9: Quantitative comparison of different cases for our ReshapeGAN without or with the perturbed loss on the CelebA dataset. The perturbed loss reduces the identification distance and preserves the source identity information while helping to improve image quality.

Method SSIM (Landmark) LPIPS (Landmark) FID Identification Distance Identity / Shape (User Study)

w/o L_perturb 0.8473 0.0827 115.4396 0.5790 0.256 / 0.896
w/ L_perturb 0.8332 0.0932 110.7313 0.5700 0.385 / 0.719

information guidance captures the pose information but fails to provide an effective gradient direction to the generator, so the model cannot generate the required controllable images. If we do not provide the given geometric information to the generator, the model cannot achieve controllable reshaping. Thus the guidance is necessary for both the generator and the discriminator. Table 7, with its quantitative comparison, supports the same conclusion.

On the other hand, we explore the effect of the perceptual loss by considering three cases: 1) without the perceptual loss, 2) with the perceptual loss computed between the synthesized fake image and the real reference image (L_percep^gr), and 3) with the perceptual loss computed between the synthesized fake image and the real input image (L_percep^gi). Visual results can be found in Fig. 17(b), and quantitative results are listed in Table 8. From the results, we can see that the model falls into a trivial solution and encounters an over-fitting problem when constraints are added between the generated images and the given target reference images; that is, the generated images become very similar to the given reference images. Meanwhile, computing the perceptual loss between the synthesized images and the input images guarantees content and identity consistency. Compared with the L1 loss, the perceptual loss focuses on high-level semantic matching between images rather than pixel-level matching.

Figure 12: (a) 100 random generated results (left) sorted by progressive increase of the identification distance (right) between the generated image and the source input image on the CelebA dataset. (b) 100 random generated results (left) sorted by progressive increase of the LPIPS distance (right) between the corresponding target reference image and the source input image on the CelebA dataset. We can see that a generated image with a lower identification distance has higher similarity, while a generated image with a lower LPIPS distance has higher quality. Please zoom in to see more details.

Furthermore, we devise similar experiments to explore the effectiveness of the perturbed loss; the results are shown in Table 9 and Fig. 17(c). As shown, the perturbed loss helps to preserve the identity information and improves the generation quality. Although the model without the perturbed loss gets slightly higher SSIM and lower LPIPS scores (computed between geometric information), we pay more attention to preserving the appearance consistency and image generation quality.


Figure 13: 100 random generated results (left) sorted by progressive increase of the identification distance (right) between the generated image and the source input image on the UTKFace dataset. Please zoom in to see more details.


Figure 14: Visual comparison of controllable face translation using different methods on the UTKFace dataset.

4.4 Reshaping by cross-domain guidance

4.4.1 Facial reshaping across multiple datasets

In this task, we use 5 facial image datasets, including Yale [18], WSEFEP [49], ADFES [63], KDEF [5], and RaFD [31], for facial reshaping.

Considering the domain gaps between different datasets, we devise three different experiments to evaluate the efficiency of our ReshapeGAN. First, we conduct experiments on the RaFD and KDEF datasets, and we



Table 10: Quantitative comparison of controllable face translation on the KDEF-RaFD dataset. Higher SSIM and lower LPIPS scores represent better geometric matching with the guided geometric information. A lower FID score means higher image quality. An identification distance larger than 0.6 indicates different identities, while a larger distance below 0.6 indicates better reshaping. Please refer to Section 4.1 for details about the evaluation metrics. The results above the dashed line are for the first case (the input image is from KDEF and the reference image is from RaFD); the results below the dashed line are for the second case (the input image is from RaFD and the reference image is from KDEF).

Method SSIM (Landmark) LPIPS (Landmark) FID Identification Distance

CycleGAN [80] 0.6181 0.2608 172.8081 0.4591
MUNIT [23] 0.6083 0.2754 182.4521 0.4721
DRIT [35] 0.6949 0.2692 115.8099 0.5427
StarGAN [10] 0.6623 0.2493 130.7385 0.0612
ReshapeGAN 0.7981 0.1184 100.8653 0.5909
- - - - - - - - - - - - - - - - - - - - - -
CycleGAN [80] 0.6323 0.2372 140.2661 0.4602
MUNIT [23] 0.6315 0.2482 154.2534 0.4654
DRIT [35] 0.6521 0.1932 84.2044 0.6361
StarGAN [10] 0.6256 0.2503 100.9369 0.0489
ReshapeGAN 0.7895 0.1222 122.9286 0.5930

choose CycleGAN, MUNIT, DRIT, and StarGAN for comparison. In order to compare fairly, we provide the geometric information to all the generators as an additional constraint for all compared methods. Additionally, for StarGAN, we use one-hot encoding to perform the facial translation on the two different datasets. By combining all the emotional categories appearing in both datasets, we get 7 different emotional labels (contemptuous, disgusted, fearful, happy, sad, and surprised) and conduct experiments following [10]. For the testing stage, we randomly select 91 input-reference pairs from the testing images of the two domains. Please note that there are two cases for the cross-domain reshaping on the KDEF-RaFD datasets: 1) the input image is from KDEF and the reference is from RaFD; 2) the input image is from RaFD and the reference image is from KDEF. The visual synthesized images are exhibited in Fig. 18 and Fig. 19(a). CycleGAN and MUNIT only achieve domain adaptation according to the given reference image, and they fail to perform reshaping by combining the geometric information. DRIT generates images with underlying position changes, but it also fails to reshape input images according to the given reference images. Above all, recent unpaired cross-domain methods between two different domains (CycleGAN, MUNIT,

and DRIT) can achieve domain adaptation (including low-level texture and background translation), but they pay more attention to low-level matching than to high-level geometric matching; that is, they fail to generate images with the required shape when provided with the geometric guidance. StarGAN fails to achieve the facial translation given the additional emotional label; we conjecture that the two datasets contain similar emotional expressions but different background information and data distributions, so the model struggles to focus on the emotional representation. Compared to the above methods, our ReshapeGAN achieves the facial reshaping by providing only a single reference image. For the quantitative comparison, Table 10 lists all the results. As shown, our method gets the highest SSIM and lowest LPIPS scores, which indicates that ReshapeGAN achieves geometric consistency with the reference images. At the same time, our method preserves the appearance information (including the background and domain information) well in terms of identification distance.
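The one-hot encoding of the merged emotion labels used for StarGAN can be sketched as follows. The label list mirrors the six categories named in the text (the text states seven labels; the full list is not reproduced here, so this is a partial, illustrative vocabulary).

```python
# Merged emotion vocabulary (illustrative; mirrors the categories listed above)
LABELS = ["contemptuous", "disgusted", "fearful", "happy", "sad", "surprised"]

def one_hot(label):
    """One-hot encode a merged emotion label for conditional translation."""
    v = [0] * len(LABELS)
    v[LABELS.index(label)] = 1
    return v

assert one_hot("happy") == [0, 0, 0, 1, 0, 0]
assert sum(one_hot("sad")) == 1  # exactly one active category
```

The resulting vector is concatenated with the input as the condition, in the same spirit as the geometric conditioning used for the other baselines.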

Second, we apply our method to the WSEFEP, ADFES, and Yale datasets, where the three datasets have small intra-domain distances; the visual results are shown in Fig. 19(b). For the above two cases, our method performs well on both object reshaping and generation quality. Finally, we conduct experiments on all five of the above emotional datasets, and the visual results can be seen in Fig. 19(c). We observe that the synthesized images have dirty color blocks when the domain gap is large (such as between KDEF and WSEFEP), since the samples from the two datasets have extremely different location and pose information, so that our method cannot generate reasonable outputs without sufficient prior information.

Figure 15: 96 randomly synthesized images by our ReshapeGAN from one input sample on the cat head dataset. The middle enlarged image shows the source input image. The smaller image framed by a red box in the lower right corner of every generated image is the given target reference image. Please zoom in to see more details.

4.4.2 Caricature reshaping across multiple datasets

For this task, we aim to achieve caricature translation between multiple datasets. The datasets include the CUHK Face Sketch database [65], KDEF [5], IIIT-CFW [45], and PHOTO-SKETCH [66]. From the four datasets, we get 5 different image style domains (there are 2 domains in the PHOTO-SKETCH dataset). For each dataset,

Figure 16: 96 randomly synthesized images by our ReshapeGAN from one input sample on the Panama dataset. The middle enlarged image shows the source input image. The smaller image framed by a red box in the lower right corner of every generated image is the given pose reference image extracted using OpenPose. We can generate a different video sequence with poses from the target identity while using the appearance and identity information from the source inputs. Please zoom in to see more details.

we randomly select approximately four fifths of the samples for training and the rest for testing. First, we conduct comparative experiments on the PHOTO-SKETCH dataset. Due to the lack of conditional labels, here we do not use StarGAN for comparison, and we follow the training/testing split in [71]. Using the same testing criteria, we randomly get 199 input-reference images. Note that the input-reference paired images are not from the same identity. The visual synthesized results can be seen in Fig. 20. As shown, our ReshapeGAN can reshape the input images according to the shape of the reference images, while CycleGAN, MUNIT, and DRIT only achieve the domain transfer between two different image manifolds, failing to find the geometric mapping function. The quantitative comparison is listed in Table 11, from which our Reshape-



Figure 17: Visual synthesized results of different cases without or with geometric information (a), without or with perceptual loss (b), and without or with perturbed loss (c), on the CelebA dataset.


Figure 18: Visual comparison of reshaping by cross-domain guidance using different methods on the KDEF-RaFD dataset.

GAN can keep better geometric consistency with an effective identification distance. Although CycleGAN, MUNIT and DRIT achieve lower FID scores through domain adaptation, they all actually fail to reshape the input images according to the reference images and to preserve the appearance and style information of the input images.
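The random four-fifths training/testing split used for each dataset above can be sketched as follows; the sample identifiers below are hypothetical placeholders, not the actual dataset files.

```python
import random

def split_train_test(samples, train_fraction=0.8, seed=0):
    """Randomly split samples into training and testing subsets,
    roughly four fifths for training as described above."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# Placeholder sample identifiers for illustration.
samples = [f"img_{i:04d}.png" for i in range(1000)]
train, test = split_train_test(samples)
print(len(train), len(test))  # 800 200
```

Seeding the shuffle keeps the split reproducible across runs, which matters when the same split is reused for every compared method.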

Actually, in the testing stage, we can generate an image of an arbitrary domain with arbitrary geometric guidance, and we can also generate n × n images based on n images from the 5 image domains for this task. To reduce the computational cost, we randomly assemble 5 images from the 5 image domains to obtain a combination. So we can get n = max_i n_i combinations, where n_i represents the num-

Figure 19: Visual results synthesized by our ReshapeGAN on the RaFD and KDEF datasets (a), on WSEFEP, ADFES, and Yale (b), and on all five datasets above (c), for facial reshaping.

ber of testing images from the 5 image domains separately. In order to evaluate the performance of our Re-


Table 11: Quantitative comparison of controllable face translation on the PHOTO-SKETCH dataset. A higher SSIM and a lower LPIPS score represent better geometric matching with the guided geometric information. A lower FID score means higher image quality. An identification distance larger than 0.6 indicates different identities, while a larger distance that is still below 0.6 indicates better reshaping. Please refer to Section 4.1 for details about the evaluation metrics. The results above the dashed line are computed in the first case, where the input image is from the PHOTO domain and the reference image is from the SKETCH domain; the results below the dashed line are computed in the second case, where the input image is from the SKETCH domain and the reference image is from the PHOTO domain.

Method            SSIM (Landmark)   LPIPS (Landmark)   FID        Identification Distance

(PHOTO input, SKETCH reference)
CycleGAN [80]     0.4839            0.3567             45.6227    0.6320
MUNIT [23]        0.4623            0.3462             89.6343    0.6103
DRIT [35]         0.4823            0.3514             67.8397    0.6420
ReshapeGAN        0.8088            0.0871             125.4597   0.5749

(SKETCH input, PHOTO reference)
CycleGAN [80]     0.4786            0.3703             77.2058    0.6038
MUNIT [23]        0.4823            0.3643             84.2434    0.6243
DRIT [35]         0.4927            0.3513             65.3231    0.6182
ReshapeGAN        0.7931            0.1123             101.5948   0.5818
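As a rough illustration of how the identification-distance column is read, the sketch below applies the 0.6 threshold from the caption to a pair of face embeddings. The embedding vectors and the cosine-distance choice are hypothetical; the paper's actual metric is defined in Section 4.1.

```python
import math

SAME_IDENTITY_THRESHOLD = 0.6  # distances above this indicate different identities

def cosine_distance(a, b):
    """Cosine distance between two face embedding vectors (hypothetical)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def same_identity(a, b):
    """Judge identity preservation using the 0.6 threshold from Table 11."""
    return cosine_distance(a, b) < SAME_IDENTITY_THRESHOLD

# Two similar embeddings should be judged as the same identity.
print(same_identity([1.0, 0.9, 0.1], [0.9, 1.0, 0.2]))  # True
```

Under this reading, ReshapeGAN's distances (0.5749 and 0.5818) stay below 0.6, i.e. identity is preserved, while the baselines exceed it.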


Figure 20: Visual comparison of reshaping by cross-domain guidance using different methods on the PHOTO-SKETCH dataset.

shapeGAN on cross-domain reshaping, we compute the FID scores between generated images and real images. Here we compute the FID scores for 4 different cases: 1) between the synthesized images produced without the domain classification loss Lcls(Dd) and the reference images (Without CLS for short), 2) between synthesized images opti-


Figure 21: Visual comparison results generated by our ReshapeGAN of different cases without or with Lcls(Dd) (a), and of different cases with L^{cross-domain 1}_{ReshapeGAN}(G, D, Dd) or with L^{cross-domain 2}_{ReshapeGAN}(G, D, Dd) (b), on the selected 5 facial datasets.


Figure 22: Visualization of the FID distances using different methods on four datasets for caricature reshaping.

mized by L^{cross-domain 1}_{ReshapeGAN}(G, D, Dd) and the reference images (Stage1 for short), 3) between synthesized images optimized by L^{cross-domain 2}_{ReshapeGAN}(G, D, Dd) and the reference images (ReshapeGAN for short), 4) between the source input images and the reference images (Real for short). We visualize all the FID results in Fig. 22. Each sub-figure shows the translation from a source domain to the other 4 target domains. We see that the FID score of the model without the domain classification loss is higher than that of the model with the domain classification loss (ReshapeGAN), so the domain classification loss does help to improve the image generation quality. Besides, if we split a difficult cross-domain object reshaping task into two easier tasks, we get better generation performance: almost all the FID scores of the proposed model are lower than those of the models using only one stage. That is, the second refining stage does improve the image generation quality. More ablation studies can be found in Section 4.5. With geometric information from the target domain, the FID between generated images and reference images is lower than the FID between source input images and reference images. Thus our ReshapeGAN can reduce domain distances by incorporating the geometric information.
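For reference, FID is the Fréchet distance between Gaussian fits of two feature distributions. The sketch below shows the univariate form of that distance on placeholder feature values; the FID reported above uses the multivariate version over Inception features, which this simplification does not reproduce.

```python
import math

def frechet_distance_1d(mu1, var1, mu2, var2):
    """Fréchet (2-Wasserstein) distance between two 1-D Gaussians;
    FID is the multivariate analogue of this quantity."""
    return (mu1 - mu2) ** 2 + var1 + var2 - 2.0 * math.sqrt(var1 * var2)

def gaussian_stats(xs):
    """Mean and (population) variance of a sample of feature values."""
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, var

# Placeholder 1-D "feature" values for generated and reference images.
generated = [0.10, 0.20, 0.15, 0.25]
reference = [0.90, 1.00, 0.95, 1.05]
d = frechet_distance_1d(*gaussian_stats(generated), *gaussian_stats(reference))
print(round(d, 4))  # 0.64
```

A smaller distance means the two distributions (and hence, in the FID setting, the two image sets) are statistically closer, which is why lower FID is read as higher image quality.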

4.5 Ablation study

In order to show the effectiveness of our pipeline architecture, we design another ablation study with a one-stage generation architecture that combines the domain classification loss Lcls(Dd) and the geometric information into a strong conditional input. Fig. 21(a) shows the visual comparison results, from which we can see that the one-stage generation method without Lcls(Dd) cannot preserve the identity and domain information well: the generated images are not sharp and the boundaries are not clear. With Lcls(Dd), the model generates images without dirty color blocks. In addition, we also explore the effectiveness of the second refining stage; the visual comparison is exhibited in Fig. 21(b). As shown, the generated images produced with the refining stage show better geometric changes according to the given guidance.
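The ablation of dropping Lcls(Dd) amounts to zeroing one term of a weighted loss sum. The sketch below illustrates only that bookkeeping; the term names mirror the losses discussed in the paper, but the numeric values and weights are invented placeholders, not the paper's actual objective.

```python
def total_generator_loss(terms, weights):
    """Weighted sum of named loss terms; omitting a weight (e.g. for the
    domain classification loss) reproduces the corresponding ablation."""
    return sum(weights.get(name, 0.0) * value for name, value in terms.items())

# Placeholder per-term loss values and weights for illustration.
terms = {"adv": 1.2, "cls": 0.8, "perceptual": 0.5, "perturb": 0.3}
full_weights = {"adv": 1.0, "cls": 1.0, "perceptual": 10.0, "perturb": 1.0}
ablated_weights = {"adv": 1.0, "perceptual": 10.0, "perturb": 1.0}  # w/o Lcls(Dd)

full = total_generator_loss(terms, full_weights)
ablated = total_generator_loss(terms, ablated_weights)
print(full, ablated)
```

Keeping the ablation as a weight change rather than a code change makes it easy to sweep which terms are active without touching the training loop.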


5 Failure and limitation

In this section, we discuss the limitations of our ReshapeGAN. Our model requires extra geometric information as an input; e.g., we use dlib [26] to obtain face landmarks as geometric information. The face landmarks therefore have to exist and be accurate, and sometimes we have to abandon face images for which dlib cannot extract landmarks (e.g., the method fails to obtain geometric information for the full left and full right profile images in the FEI dataset). Obviously, wrong geometric information will hurt the performance of our model. Since we use an unsupervised manner in our experiments to achieve cross-dataset image reshaping, we still lack prior knowledge when the semantic domain gap between different image domains is huge. Fig. 23 shows some failure cases of ReshapeGAN, where our model fails to preserve the appearance of the input images well, and the outputs look like a simple combination of the source image and the target image. Visually, our ReshapeGAN fails to reshape the input image to the reference geometric shape if the distance between the inputs and the references is too large, since our model has insufficient corresponding prior information to achieve a reasonable translation. In this case, disentangling the content representation could help preserve the content information, and we leave this as our future work.
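The filtering step described above (dropping images whose landmarks cannot be detected) can be sketched as follows. Here `detect_landmarks` is a hypothetical stub standing in for dlib's face detector and shape predictor; it only looks up precomputed landmarks and returns None on failure.

```python
def detect_landmarks(image):
    """Stand-in for the dlib detector + shape predictor used in the paper;
    hypothetical: reads precomputed landmarks, or None when detection fails."""
    return image.get("landmarks")

def filter_usable_faces(images):
    """Keep only images with detected landmarks; the rest are abandoned,
    as done for e.g. the full-profile FEI images mentioned above."""
    usable, abandoned = [], []
    for img in images:
        (usable if detect_landmarks(img) is not None else abandoned).append(img)
    return usable, abandoned

images = [
    {"name": "frontal.png", "landmarks": [(30, 40), (60, 40)]},
    {"name": "full_profile.png", "landmarks": None},  # detection-failure case
]
usable, abandoned = filter_usable_faces(images)
print(len(usable), len(abandoned))  # 1 1
```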


Figure 23: Failure cases of object reshaping by our ReshapeGAN across multiple datasets.

6 Conclusion

In this paper, we proposed a new architecture that can generate domain-specific or cross-domain images with geometric guidance, which is the first general framework for a wide range of object reshaping tasks driven by a single reference image. Comprehensive experiments, covering both ablation studies and comparisons with comparable state-of-the-art models in terms of all kinds of applicable quantitative, qualitative, and human subjective evaluation metrics, show that our model performs better for identity-preserved object reshaping.

Acknowledgment

This work was supported in part by the Royal Society under IEC\R3\170013 - International Exchanges 2017 Cost Share (Japan and Taiwan only), in part by a Microsoft Research Asia (MSRA) Collaborative Research 2019 Grant, and in part by the National Natural Science Foundation of China under grants 61771440 and 41776113.


References

[1] Antipov, G., Baccouche, M., Dugelay, J.L.: Face aging with conditional generative adversarial networks. arXiv preprint arXiv:1702.01983 (2017)

[2] Arjovsky, M., Chintala, S., Bottou, L.: Wasserstein generative adversarial networks. In: International Conference on Machine Learning, pp. 214–223 (2017)

[3] Bau, D., Zhu, J.Y., Strobelt, H., Zhou, B., Tenenbaum, J.B., Freeman, W.T., Torralba, A.: GAN dissection: Visualizing and understanding generative adversarial networks. In: International Conference on Learning Representations (2019)

[4] Borji, A.: Pros and cons of GAN evaluation measures. Computer Vision and Image Understanding 179, 41–65 (2019)

[5] Calvo, M.G., Lundqvist, D.: Facial expressions of emotion (KDEF): Identification under different display-duration conditions. Behavior Research Methods 40(1), 109–115 (2008)

[6] Cao, Z., Simon, T., Wei, S.E., Sheikh, Y.: Realtime multi-person 2D pose estimation using part affinity fields. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 7291–7299 (2017)

[7] Chen, Q., Koltun, V.: Photographic image synthesis with cascaded refinement networks. In: IEEE International Conference on Computer Vision, pp. 1511–1520 (2017)

[8] Chen, X., Duan, Y., Houthooft, R., Schulman, J., Sutskever, I., Abbeel, P.: InfoGAN: Interpretable representation learning by information maximizing generative adversarial nets. In: Advances in Neural Information Processing Systems, pp. 2172–2180 (2016)

[9] Chen, Z., Tong, Y.: Face super-resolution through Wasserstein GANs. arXiv preprint arXiv:1705.02438 (2017)

[10] Choi, Y., Choi, M., Kim, M., Ha, J.W., Kim, S., Choo, J.: StarGAN: Unified generative adversarial networks for multi-domain image-to-image translation. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 8789–8797 (2018)

[11] Creswell, A., White, T., Dumoulin, V., Arulkumaran, K., Sengupta, B., Bharath, A.A.: Generative adversarial networks: An overview. IEEE Signal Processing Magazine 35(1), 53–65 (2018)

[12] Denton, E.L., Chintala, S., Fergus, R., et al.: Deep generative image models using a Laplacian pyramid of adversarial networks. In: Advances in Neural Information Processing Systems, pp. 1486–1494 (2015)

[13] Devon Hjelm, R., Jacob, A.P., Che, T., Trischler, A., Cho, K., Bengio, Y.: Boundary-seeking generative adversarial networks. arXiv preprint arXiv:1702.08431 (2017)

[14] Donahue, J., Krahenbuhl, P., Darrell, T.: Adversarial feature learning. arXiv preprint arXiv:1605.09782 (2016)

[15] Dosovitskiy, A., Brox, T.: Generating images with perceptual similarity metrics based on deep networks. In: Advances in Neural Information Processing Systems, pp. 658–666 (2016)

[16] Dumoulin, V., Belghazi, I., Poole, B., Mastropietro, O., Lamb, A., Arjovsky, M., Courville, A.: Adversarially learned inference. arXiv preprint arXiv:1606.00704 (2016)

[17] Gatys, L.A., Ecker, A.S., Bethge, M.: A neural algorithm of artistic style. arXiv preprint arXiv:1508.06576 (2015)

[18] Georghiades, A., Belhumeur, P., Kriegman, D.: Yale face database. Center for Computational Vision and Control at Yale University, http://cvc.yale.edu/projects/yalefaces/yalefa 2 (1997)

[19] Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., Bengio, Y.: Generative adversarial nets. In: Advances in Neural Information Processing Systems, pp. 2672–2680 (2014)

[20] Gulrajani, I., Ahmed, F., Arjovsky, M., Dumoulin, V., Courville, A.C.: Improved training of Wasserstein GANs. In: Advances in Neural Information Processing Systems, pp. 5767–5777 (2017)

[21] Hore, A., Ziou, D.: Image quality metrics: PSNR vs. SSIM. In: International Conference on Pattern Recognition, pp. 2366–2369. IEEE (2010)

[22] Huang, H., Yu, P.S., Wang, C.: An introduction to image synthesis with generative adversarial nets. arXiv preprint arXiv:1803.04469 (2018)

[23] Huang, X., Liu, M.Y., Belongie, S., Kautz, J.: Multimodal unsupervised image-to-image translation. In: European Conference on Computer Vision, pp. 172–189 (2018)

[24] Isola, P., Zhu, J.Y., Zhou, T., Efros, A.A.: Image-to-image translation with conditional adversarial networks. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 1125–1134 (2017)

[25] Kim, T., Cha, M., Kim, H., Lee, J.K., Kim, J.: Learning to discover cross-domain relations with generative adversarial networks. In: International Conference on Machine Learning, pp. 1857–1865. JMLR.org (2017)

[26] King, D.E.: Dlib-ml: A machine learning toolkit. Journal of Machine Learning Research 10(Jul), 1755–1758 (2009)

[27] Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)

[28] Kodali, N., Abernethy, J., Hays, J., Kira, Z.: On convergence and stability of GANs. arXiv preprint arXiv:1705.07215 (2017)

[29] Kurach, K., Lucic, M., Zhai, X., Michalski, M., Gelly, S.: The GAN landscape: Losses, architectures, regularization, and normalization. arXiv preprint arXiv:1807.04720 (2018)

[30] Lample, G., Zeghidour, N., Usunier, N., Bordes, A., Denoyer, L., et al.: Fader networks: Manipulating images by sliding attributes. In: Advances in Neural Information Processing Systems, pp. 5967–5976 (2017)

[31] Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D., Hawk, S., van Knippenberg, A.: Presentation and validation of the Radboud Faces Database. Cognition and Emotion 24, 1377–1388 (2010)

[32] Larsen, A.B.L., Sønderby, S.K., Larochelle, H., Winther, O.: Autoencoding beyond pixels using a learned similarity metric. In: International Conference on Machine Learning, pp. 1558–1566 (2016)

[33] Lassner, C., Pons-Moll, G., Gehler, P.V.: A generative model ofpeople in clothing. In: IEEE International Conference on Com-puter Vision, pp. 853–862 (2017)


[34] Ledig, C., Theis, L., Huszar, F., Caballero, J., Cunningham, A., Acosta, A., Aitken, A., Tejani, A., Totz, J., Wang, Z., et al.: Photo-realistic single image super-resolution using a generative adversarial network. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 4681–4690 (2017)

[35] Lee, H.Y., Tseng, H.Y., Huang, J.B., Singh, M., Yang, M.H.: Diverse image-to-image translation via disentangled representations. In: European Conference on Computer Vision, pp. 35–51 (2018)

[36] Li, J., Liang, X., Wei, Y., Xu, T., Feng, J., Yan, S.: Perceptual generative adversarial networks for small object detection. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 1222–1230 (2017)

[37] Liu, M.Y., Breuel, T., Kautz, J.: Unsupervised image-to-image translation networks. In: Advances in Neural Information Processing Systems, pp. 700–708 (2017)

[38] Liu, Z., Luo, P., Wang, X., Tang, X.: Deep learning face attributes in the wild. In: IEEE International Conference on Computer Vision, pp. 3730–3738 (2015)

[39] Lucic, M., Kurach, K., Michalski, M., Gelly, S., Bousquet, O.: Are GANs created equal? A large-scale study. In: Advances in Neural Information Processing Systems, pp. 700–709 (2018)

[40] Luo, Y., Xu, Y., Ji, H.: Removing rain from a single image via discriminative sparse coding. In: IEEE International Conference on Computer Vision, pp. 3397–3405 (2015)

[41] Ma, L., Jia, X., Georgoulis, S., Tuytelaars, T., Van Gool, L.: Exemplar guided unsupervised image-to-image translation. In: International Conference on Learning Representations (2019)

[42] Ma, L., Jia, X., Sun, Q., Schiele, B., Tuytelaars, T., Van Gool, L.: Pose guided person image generation. In: Advances in Neural Information Processing Systems, pp. 406–416 (2017)

[43] Mao, X., Li, Q., Xie, H., Lau, R.Y., Wang, Z., Smolley, S.P.: Least squares generative adversarial networks. In: IEEE International Conference on Computer Vision, pp. 2813–2821. IEEE (2017)

[44] Mirza, M., Osindero, S.: Conditional generative adversarial nets. arXiv preprint arXiv:1411.1784 (2014)

[45] Mishra, A., Rai, S.N., Mishra, A., Jawahar, C.: IIIT-CFW: A benchmark database of cartoon faces in the wild. In: European Conference on Computer Vision, pp. 35–47 (2016)

[46] Nguyen, A., Clune, J., Bengio, Y., Dosovitskiy, A., Yosinski, J.: Plug & play generative networks: Conditional iterative generation of images in latent space. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 4467–4477 (2017)

[47] Nogue, F.C., Huie, A., Dasgupta, S.: Object detection using domain randomization and generative adversarial refinement of synthetic images. arXiv preprint arXiv:1805.11778 (2018)

[48] Odena, A., Olah, C., Shlens, J.: Conditional image synthesis with auxiliary classifier GANs. In: International Conference on Machine Learning, pp. 2642–2651. JMLR.org (2017)

[49] Olszanowski, M., Pochwatko, G., Kuklinski, K., Scibor-Rylski, M., Lewinski, P., Ohme, R.K.: Warsaw Set of Emotional Facial Expression Pictures: A validation study of facial display photographs. Frontiers in Psychology 5, 1516 (2015)

[50] Park, T., Liu, M.Y., Wang, T.C., Zhu, J.Y.: Semantic image synthesis with spatially-adaptive normalization. In: IEEE Conference on Computer Vision and Pattern Recognition (2019)

[51] Pathak, D., Krahenbuhl, P., Donahue, J., Darrell, T., Efros, A.A.: Context encoders: Feature learning by inpainting. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 2536–2544 (2016)

[52] Press, O., Bar, A., Bogin, B., Berant, J., Wolf, L.: Language generation with recurrent generative adversarial networks without pre-training. arXiv preprint arXiv:1706.01399 (2017)

[53] Qi, G.J.: Loss-sensitive generative adversarial networks on Lipschitz densities. arXiv preprint arXiv:1701.06264 (2017)

[54] Radford, A., Metz, L., Chintala, S.: Unsupervised representation learning with deep convolutional generative adversarial networks. arXiv preprint arXiv:1511.06434 (2015)

[55] Reed, S., Akata, Z., Yan, X., Logeswaran, L., Schiele, B., Lee, H.: Generative adversarial text to image synthesis. arXiv preprint arXiv:1605.05396 (2016)

[56] Reed, S., van den Oord, A., Kalchbrenner, N., Bapst, V., Botvinick, M., de Freitas, N.: Generating interpretable images with controllable structure. In: International Conference on Learning Representations Workshop (2017)

[57] Reed, S.E., Akata, Z., Mohan, S., Tenka, S., Schiele, B., Lee, H.: Learning what and where to draw. In: Advances in Neural Information Processing Systems, pp. 217–225 (2016)

[58] Shrivastava, A., Pfister, T., Tuzel, O., Susskind, J., Wang, W., Webb, R.: Learning from simulated and unsupervised images through adversarial training. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 2107–2116 (2017)

[59] Sønderby, C.K., Caballero, J., Theis, L., Shi, W., Huszar, F.: Amortised MAP inference for image super-resolution. In: International Conference on Learning Representations, pp. 1–17 (2017)

[60] Taigman, Y., Polyak, A., Wolf, L.: Unsupervised cross-domain image generation. In: International Conference on Learning Representations (2017)

[61] Thomaz, C.E., Giraldi, G.A.: A new ranking method for principal components analysis and its application to face image analysis. Image and Vision Computing 28(6), 902–913 (2010)

[62] Tran, L.Q., Yin, X., Liu, X.: Representation learning by rotating your faces. IEEE Transactions on Pattern Analysis and Machine Intelligence (2018)

[63] Van Der Schalk, J., Hawk, S.T., Fischer, A.H., Doosje, B.: Moving faces, looking places: Validation of the Amsterdam Dynamic Facial Expression Set (ADFES). Emotion 11(4), 907 (2011)

[64] Wang, T.C., Liu, M.Y., Zhu, J.Y., Tao, A., Kautz, J., Catanzaro, B.: High-resolution image synthesis and semantic manipulation with conditional GANs. arXiv preprint arXiv:1711.11585 (2017)

[65] Wang, X., Tang, X.: Face photo-sketch synthesis and recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence 31(11), 1955–1967 (2008)

[66] Wang, X., Tang, X.: Face photo-sketch synthesis and recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence 31(11), 1955–1967 (2009)


[67] Willmott, C.J., Matsuura, K.: Advantages of the mean absolute error (MAE) over the root mean square error (RMSE) in assessing average model performance. Climate Research 30(1), 79–82 (2005)

[68] Yan, X., Yang, J., Sohn, K., Lee, H.: Attribute2Image: Conditional image generation from visual attributes. In: European Conference on Computer Vision (2016)

[69] Yang, C., Lu, X., Lin, Z., Shechtman, E., Wang, O., Li, H.: High-resolution image inpainting using multi-scale neural patch synthesis. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 6721–6729 (2017)

[70] Yang, C., Wang, Z., Zhu, X., Huang, C., Shi, J., Lin, D.: Pose guided human video generation. In: European Conference on Computer Vision, pp. 201–216 (2018)

[71] Yi, Z., Zhang, H., Tan, P., Gong, M.: DualGAN: Unsupervised dual learning for image-to-image translation. In: IEEE International Conference on Computer Vision, pp. 2849–2857 (2017)

[72] Zhang, H., Sindagi, V., Patel, V.M.: Image de-raining using a conditional generative adversarial network. arXiv preprint arXiv:1701.05957 (2017)

[73] Zhang, H., Xu, T., Li, H., Zhang, S., Wang, X., Huang, S.X., Metaxas, D.N.: StackGAN++: Realistic image synthesis with stacked generative adversarial networks. IEEE Transactions on Pattern Analysis and Machine Intelligence (2018)

[74] Zhang, H., Xu, T., Li, H., Zhang, S., Wang, X., Huang, X., Metaxas, D.N.: StackGAN: Text to photo-realistic image synthesis with stacked generative adversarial networks. In: IEEE International Conference on Computer Vision, pp. 5907–5915 (2017)

[75] Zhang, R., Isola, P., Efros, A.A., Shechtman, E., Wang, O.: The unreasonable effectiveness of deep features as a perceptual metric. arXiv preprint (2018)

[76] Zhang, W., Sun, J., Tang, X.: Cat head detection - how to effectively exploit shape and texture features. In: European Conference on Computer Vision, pp. 802–816. Springer (2008)

[77] Zhang, Z., Song, Y., Qi, H.: Age progression/regression by conditional adversarial autoencoder. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 5810–5818 (2017)

[78] Zhao, B., Wu, X., Cheng, Z.Q., Liu, H., Jie, Z., Feng, J.: Multi-view image generation from a single-view. In: ACM Multimedia Conference, pp. 383–391. ACM (2018)

[79] Zhao, J., Mathieu, M., LeCun, Y.: Energy-based generative adversarial network. arXiv preprint arXiv:1609.03126 (2016)

[80] Zhu, J.Y., Park, T., Isola, P., Efros, A.A.: Unpaired image-to-image translation using cycle-consistent adversarial networks. In: IEEE International Conference on Computer Vision, pp. 2223–2232 (2017)

[81] Zhu, J.Y., Zhang, R., Pathak, D., Darrell, T., Efros, A.A., Wang, O., Shechtman, E.: Toward multimodal image-to-image translation. In: Advances in Neural Information Processing Systems, pp. 465–476 (2017)
