
Asymmetric Training in RealnessGAN



Abstract:

Generative adversarial networks (GANs) have demonstrated superior performance in image generation. In recent years, GANs have seen numerous advances in both network structure and learning theory. Among these techniques, asymmetric training of the generator and discriminator networks has been widely adopted: for example, batch normalization is used in the generator while spectral normalization is used in the discriminator, or different learning rates are assigned to the two networks. However, asymmetric training of the real and generated samples has not yet been considered. In this paper, we propose a novel asymmetric-training-based RealnessGAN (ATRGAN), which applies the idea of asymmetric training to both samples and networks. Specifically, asymmetric training on samples refers to differential learning on the real and generated samples, achieved by controlling the information entropies of the real and fake anchor distributions. Asymmetric training on networks is realized through the sampling transmission G2D, which abandons the commonly used independent random sampling. With the help of G2D, the discriminator obtains a dominant training position over the generator, ensuring that it can guide the generator more effectively during training. In addition, we propose a floating anchor distribution technique and construct the objective function of the generator for ATRGAN. Comparative experiments demonstrate that ATRGAN achieves better generation performance than various state-of-the-art GANs on the CIFAR-10, CAT, and CelebA-HQ datasets.
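The abstract's sample-level asymmetry rests on anchor distributions with different information entropies, in the spirit of RealnessGAN, where the discriminator outputs a distribution over discrete realness outcomes and is pulled toward a real or fake anchor via KL divergence. The sketch below illustrates only this idea under assumed settings: the number of outcomes, the peak positions, and the temperature-controlled sharpness of each anchor are all hypothetical choices, not the paper's actual construction.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def entropy(p):
    # Shannon entropy in nats; epsilon guards against log(0).
    return -np.sum(p * np.log(p + 1e-12), axis=-1)

def kl(p, q):
    # KL(p || q): how far the predicted realness distribution q
    # is from the anchor p.
    return np.sum(p * np.log((p + 1e-12) / (q + 1e-12)), axis=-1)

def make_anchor(num_outcomes, peak, temperature):
    # Hypothetical anchor builder: scores decay with distance from
    # `peak`; a lower temperature yields a sharper, lower-entropy anchor.
    scores = -np.abs(np.arange(num_outcomes) - peak) / temperature
    return softmax(scores)

# Asymmetric anchors (assumed settings): a sharp, low-entropy anchor
# for real samples and a flatter, higher-entropy anchor for fakes.
N = 10
anchor_real = make_anchor(N, peak=N - 1, temperature=0.5)
anchor_fake = make_anchor(N, peak=0, temperature=2.0)
assert entropy(anchor_real) < entropy(anchor_fake)

# A mock discriminator output: a distribution over the N outcomes.
rng = np.random.default_rng(0)
d_out = softmax(rng.standard_normal(N))
loss_real = kl(anchor_real, d_out)  # term pulling D(x_real) toward the real anchor
loss_fake = kl(anchor_fake, d_out)  # term pulling D(G(z)) toward the fake anchor
```

Controlling the two temperatures independently is one plausible way to realize the "differential learning on real and generated samples" described above, since the anchor with lower entropy imposes a stricter target on the discriminator.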
Published in: IEEE Transactions on Multimedia ( Volume: 25)
Page(s): 8157 - 8169
Date of Publication: 30 December 2022

