About the training loss #22
The result images shown in the main body of the paper were generated with WGAN. After the paper was accepted, I ran some further experiments and found that the original GAN loss actually helps improve the quality of the output images on the datasets used in the paper. However, I cannot conclude that the traditional GAN loss is always better than the WGAN loss for the DualGAN architecture.
If you are interested in further exploration, you may try WGAN-GP or other losses.
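For reference, the difference between the objectives discussed here can be sketched in plain NumPy. The function names below are illustrative, not taken from the DualGAN code: the traditional GAN discriminator uses a sigmoid cross-entropy loss, the WGAN critic minimizes a mean score difference on unbounded outputs, and WGAN-GP adds a gradient penalty term.

```python
import numpy as np

def sigmoid(x):
    """Numerically stable logistic function."""
    return np.where(x >= 0, 1.0 / (1.0 + np.exp(-x)), np.exp(x) / (1.0 + np.exp(x)))

def gan_d_loss(d_real_logits, d_fake_logits):
    """Traditional GAN discriminator loss: binary cross-entropy
    with targets real=1, fake=0, applied to raw logits."""
    eps = 1e-12  # guard against log(0)
    real_term = -np.log(sigmoid(d_real_logits) + eps)
    fake_term = -np.log(1.0 - sigmoid(d_fake_logits) + eps)
    return float(np.mean(real_term) + np.mean(fake_term))

def wgan_critic_loss(d_real_scores, d_fake_scores):
    """WGAN critic loss: maximize E[D(real)] - E[D(fake)], i.e. minimize
    the negated difference. Scores are unbounded; no sigmoid is applied."""
    return float(np.mean(d_fake_scores) - np.mean(d_real_scores))

def wgan_gp_penalty(grad_norms, lam=10.0):
    """WGAN-GP gradient penalty: lam * E[(||grad D(x_hat)|| - 1)^2],
    where grad_norms are the critic's gradient norms at points
    interpolated between real and fake samples."""
    return float(lam * np.mean((grad_norms - 1.0) ** 2))
```

Note that a confident discriminator (large positive logits on real, large negative on fake) drives the cross-entropy loss toward zero, while the WGAN critic loss is simply the (negated) score gap and the penalty vanishes when gradient norms are exactly 1.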
Why is noise z not used?
Instead of explicitly adding random noise, I used dropout.
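A minimal sketch of that idea, assuming inverted dropout is kept active as the source of stochasticity (as in pix2pix-style generators); the helper below is illustrative, not the repository's code:

```python
import numpy as np

def dropout(x, rate=0.5, rng=None):
    """Inverted dropout: zero out activations with probability `rate`
    and rescale survivors by 1/(1-rate). Keeping this active at test
    time injects randomness, so the dropout mask plays the role that
    an explicit noise vector z would otherwise play."""
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = np.ones((4, 8))
# Two forward passes over the same input produce different activations
# because each call draws a fresh dropout mask.
y1 = dropout(x, rate=0.5, rng=np.random.default_rng(0))
y2 = dropout(x, rate=0.5, rng=np.random.default_rng(1))
```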
def preprocess_img(img, img_size=128, flip=False, is_test=False):
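The thread only quotes the signature of this helper. A hypothetical body, assuming typical pix2pix-style preprocessing (resize to a square, optional training-time horizontal flip, scaling pixels to [-1, 1]) and written NumPy-only for portability, might look like:

```python
import numpy as np

def preprocess_img(img, img_size=128, flip=False, is_test=False):
    """Hypothetical sketch of the quoted helper (body not shown in the
    thread): resize to img_size x img_size via nearest-neighbour row/column
    indexing, optionally random-flip during training, and scale uint8
    pixel values to the [-1, 1] range expected by a tanh generator."""
    h, w = img.shape[:2]
    rows = np.arange(img_size) * h // img_size
    cols = np.arange(img_size) * w // img_size
    img = img[rows][:, cols]  # nearest-neighbour resize
    if flip and not is_test and np.random.rand() > 0.5:
        img = img[:, ::-1]  # horizontal flip as cheap augmentation
    return img.astype(np.float32) / 127.5 - 1.0
```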
Original question (Yihong Gu):
I read your code about the loss design and found that your implementation differs from the one proposed in the paper. Do you use the traditional GAN loss instead of the WGAN loss? Does that mean the WGAN loss might not be a good choice in practice?