
Cascade conditional generative adversarial nets for spatial-spectral hyperspectral sample generation

  • Research Paper
  • Published in Science China Information Sciences

Abstract

Sample generation is an effective way to address the insufficiency of training data for hyperspectral image classification. The generative adversarial network (GAN) is a popular deep learning method that uses adversarial training to generate samples of a required class. In this paper, we propose cascade conditional generative adversarial nets, named C2GAN, for complete spatial-spectral sample generation for hyperspectral images. C2GAN works in two stages. The stage-one model generates the spatial information of a fixed-size window region by feeding in random noise and the required class label. The stage-two model generates the spatial-spectral information, producing the spectral values of all bands within that spatial region by feeding in the generated label region. Visualization and verification of the generated samples on the Pavia University and Salinas datasets show superior performance, demonstrating that our method is useful for hyperspectral image classification.
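To make the cascade concrete, the sketch below illustrates the two-stage idea described in the abstract: a stage-one generator maps random noise plus a class label to a spatial label region of a fixed window, and a stage-two generator translates that label region into a spectral cube covering all bands. This is a minimal PyTorch sketch under assumed settings (a 9x9 window, 9 classes and 103 bands as in the Pavia University scene, and illustrative layer widths); it is not the authors' exact C2GAN architecture, and the discriminators and adversarial training loop are omitted.

import torch
import torch.nn as nn

# Assumed, illustrative sizes (not taken from the paper's configuration)
NOISE_DIM, NUM_CLASSES, WINDOW, BANDS = 100, 9, 9, 103

class StageOneGenerator(nn.Module):
    # Stage one: noise + class label -> spatial label region (WINDOW x WINDOW)
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + NUM_CLASSES, 256), nn.ReLU(),
            nn.Linear(256, NUM_CLASSES * WINDOW * WINDOW),
        )

    def forward(self, z, label_onehot):
        x = self.net(torch.cat([z, label_onehot], dim=1))
        # one score map per class for every pixel of the window
        return x.view(-1, NUM_CLASSES, WINDOW, WINDOW)

class StageTwoGenerator(nn.Module):
    # Stage two: label region -> spectral values of all BANDS in that region
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(NUM_CLASSES, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, BANDS, kernel_size=3, padding=1), nn.Tanh(),
        )

    def forward(self, label_region):
        return self.net(label_region)

# Usage: draw one fake spatial-spectral sample for class index 3
g1, g2 = StageOneGenerator(), StageTwoGenerator()
z = torch.randn(1, NOISE_DIM)
label = torch.zeros(1, NUM_CLASSES)
label[0, 3] = 1.0
region = g1(z, label)   # spatial label region, shape (1, NUM_CLASSES, WINDOW, WINDOW)
sample = g2(region)     # spectral cube, shape (1, BANDS, WINDOW, WINDOW)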



Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant Nos. 61973285, 61873249, 61773355, 61603355), the Natural Science Foundation of Hubei Province (Grant No. 2018CFB528), the Opening Fund of the Ministry of Education Key Laboratory of Geological Survey and Evaluation (Grant No. CUG2019ZR10), and the Fundamental Research Funds for the Central Universities (Grant No. CUGL17022).

Author information

Corresponding author

Correspondence to Yulin Qiao.


About this article


Cite this article

Liu, X., Qiao, Y., Xiong, Y. et al. Cascade conditional generative adversarial nets for spatial-spectral hyperspectral sample generation. Sci. China Inf. Sci. 63, 140306 (2020). https://doi.org/10.1007/s11432-019-2798-9

