An adversarial non-volume preserving flow model with Boltzmann priors

  • Original Article
  • International Journal of Machine Learning and Cybernetics

Abstract

Flow-based generative models (flow models) are conceptually attractive because they permit exact log-likelihood computation and exact latent-variable inference. To generate sharper images and to extend the Gaussian prior of flow models to other discrete forms, we propose an adversarial non-volume preserving flow model with Boltzmann priors (ANVP) for modeling complex high-dimensional densities. To sharpen the generated images, the ANVP model adds an adversarial regularizer to the loss function that penalizes the model for placing high probability in regions where the training data distribution has low density. Moreover, we show that the Gaussian prior can be replaced by other forms, such as the Boltzmann prior, in the proposed model, and we combine multi-scale transformations with Boltzmann priors to model the data distribution. Experiments show that the proposed model is effective on image generation tasks.
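The abstract combines three ingredients: a non-volume preserving (Real NVP-style) coupling flow, a Boltzmann prior in place of the Gaussian, and an adversarial regularizer added to the maximum-likelihood loss. The sketch below illustrates how these pieces could fit together in PyTorch; it is not the authors' implementation, and the names (`AffineCoupling`, `boltzmann_log_prior`, `anvp_loss`), the hyperparameters, the discriminator `disc`, and the model samples `x_gen` are all hypothetical. The Boltzmann term is the unnormalized RBM log-density, so the objective equals the log-likelihood only up to the prior's partition function.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AffineCoupling(nn.Module):
    """Real NVP-style coupling: y1 = x1, y2 = x2 * exp(s(x1)) + t(x1).
    The exp(s) scaling makes the transform non-volume preserving."""

    def __init__(self, dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                      # keep the log-scales bounded
        y2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=1)                 # log |det dy/dx| of the coupling
        return torch.cat([x1, y2], dim=1), log_det


def boltzmann_log_prior(z, W, b, c):
    """Unnormalized RBM log-density of the latent code z (hidden units
    marginalized out); the partition function is a constant w.r.t. z."""
    return z @ b + F.softplus(z @ W + c).sum(dim=1)


def anvp_loss(flow_layers, disc, x, x_gen, W, b, c, lam=0.1):
    """Negative log-likelihood under the Boltzmann prior plus an adversarial
    term that penalizes model samples the discriminator labels as non-data."""
    z, log_det = x, 0.0
    for layer in flow_layers:                    # map data x to latent z
        z, ld = layer(z)
        log_det = log_det + ld
    log_px = boltzmann_log_prior(z, W, b, c) + log_det   # change of variables
    nll = -log_px.mean()
    adv = -torch.log(disc(x_gen) + 1e-8).mean()  # non-saturating GAN-style penalty
    return nll + lam * adv
```

In this reading, the adversarial term discourages the flow from assigning mass to regions the discriminator identifies as far from the data, while the exact change-of-variables log-likelihood remains the main training signal.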

Acknowledgements

This work is supported by the Outstanding Innovation Scholarship for Doctoral Candidates of CUMT (No. 2019YCBS058).

Author information

Corresponding author

Correspondence to Shifei Ding.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zhang, J., Ding, S. & Jia, W. An adversarial non-volume preserving flow model with Boltzmann priors. Int. J. Mach. Learn. & Cyber. 11, 913–921 (2020). https://doi.org/10.1007/s13042-019-01048-8
