
A Cross-Paired Wavelet Based Spatiotemporal Fusion Network for Remote Sensing Images

  • Conference paper
Image and Graphics (ICIG 2023)

Abstract

Spatiotemporal fusion is an effective way to provide remote sensing images with both high temporal and high spatial resolution for Earth observation. Most existing methods require at least three images as input, which can complicate practical application. To this end, a cross-paired wavelet based spatiotemporal fusion network (CPW-STFN) for remote sensing images is proposed. The wavelet transform decomposes the low- and high-frequency components of an image into four channels, enabling the model to learn features at different levels separately, so the proposed CPW-STFN extracts detailed textures as well as global information more easily and effectively. As a result, spatiotemporal fusion is achieved with only two cross-paired images as input: the fine-resolution image at the reference date and the coarse-resolution image at the prediction date. In addition, a compound loss function containing a wavelet loss is proposed to promote the preservation of spatial detail. The fusion ability of the proposed CPW-STFN was tested on the widely used CIA and LGC datasets and compared with other methods, including STARFM, FSDAF, EDCSTFN, MLFF-GAN and GAN-STFM. CPW-STFN performs better than GAN-STFM, which also requires only two input images, and is not inferior to the other methods, which require at least three inputs, demonstrating its advantage and potential.
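The four-channel decomposition described above can be sketched with a single-level 2-D Haar discrete wavelet transform, which splits an image into a low-frequency approximation subband (LL) and three half-resolution high-frequency detail subbands (LH, HL, HH). The L1 form and subband weighting of `wavelet_loss` below are illustrative assumptions for what a detail-preserving wavelet loss could look like, not the paper's exact formulation.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar DWT: split an H x W image (H, W even) into
    four half-resolution subbands -- LL (approximation) plus LH/HL/HH
    (detail) channels."""
    a = img[0::2, 0::2]  # even rows, even cols
    b = img[0::2, 1::2]  # even rows, odd cols
    c = img[1::2, 0::2]  # odd rows,  even cols
    d = img[1::2, 1::2]  # odd rows,  odd cols
    ll = (a + b + c + d) / 2.0
    lh = (a + b - c - d) / 2.0
    hl = (a - b + c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Inverse of haar_dwt2: perfect reconstruction of the original image."""
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w), dtype=ll.dtype)
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    out[0::2, 1::2] = (ll + lh - hl - hh) / 2.0
    out[1::2, 0::2] = (ll - lh + hl - hh) / 2.0
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return out

def wavelet_loss(pred, target, w_low=1.0, w_high=2.0):
    """Hypothetical wavelet loss: L1 distance between the wavelet subbands
    of the prediction and the target, weighting the high-frequency (detail)
    subbands more heavily to encourage spatial-detail preservation."""
    p = haar_dwt2(pred)
    t = haar_dwt2(target)
    loss = w_low * np.mean(np.abs(p[0] - t[0]))
    loss += w_high * sum(np.mean(np.abs(ps - ts)) for ps, ts in zip(p[1:], t[1:]))
    return loss
```

In a compound loss of the kind the abstract describes, such a term would be added to an ordinary pixel-wise content loss, e.g. `total = l1(pred, target) + lam * wavelet_loss(pred, target)`, with `lam` a tunable weight.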


References

  1. Li, J., Li, Y., He, L., Chen, J., Plaza, A.: Spatio-temporal fusion for remote sensing data: an overview and new benchmark. Sci. China Inf. Sci. 63(4), 7–23 (2020)

  2. Tan, Z., Di, L., Zhang, M., Guo, L., Gao, M.: An enhanced deep convolutional model for spatiotemporal image fusion. Remote Sensing 11(24), 2898 (2019)

  3. Chen, B., Huang, B., Xu, B.: Comparison of spatiotemporal fusion models: a review. Remote Sensing 7(2), 1798–1835 (2015)

  4. Gao, F., Masek, J., Schwaller, M., Hall, F.: On the blending of the Landsat and MODIS surface reflectance: predicting daily Landsat surface reflectance. IEEE Trans. Geosci. Remote Sens. 44(8), 2207–2218 (2006)

  5. Zhu, X., Chen, J., Gao, F., Chen, X., Masek, J.G.: An enhanced spatial and temporal adaptive reflectance fusion model for complex heterogeneous regions. Remote Sens. Environ. 114(11), 2610–2623 (2010)

  6. Hilker, T., Wulder, M.A., Coops, N.C., et al.: A new data fusion model for high spatial- and temporal-resolution mapping of forest disturbance based on Landsat and MODIS. Remote Sens. Environ. 113(8), 1613–1627 (2009)

  7. Zhukov, B., Oertel, D., Lanzl, F., Reinhackel, G.: Unmixing-based multisensor multiresolution image fusion. IEEE Trans. Geosci. Remote Sens. 37(3), 1212–1226 (1999)

  8. Wu, M., Niu, Z., Wang, C., Wu, C., Wang, L.: Use of MODIS and Landsat time series data to generate high-resolution temporal synthetic Landsat data using a spatial and temporal reflectance fusion model. J. Appl. Remote Sens. 6(1), 063507 (2012)

  9. Xue, J., Leung, Y., Fung, T.: A Bayesian data fusion approach to spatio-temporal fusion of remotely sensed images. Remote Sensing 9(12), 1310 (2017)

  10. Shen, H., Meng, X., Zhang, L.: An integrated framework for the spatio-temporal-spectral fusion of remote sensing images. IEEE Trans. Geosci. Remote Sens. 54(12), 7135–7148 (2016)

  11. Huang, B., Song, H.: Spatiotemporal reflectance fusion via sparse representation. IEEE Trans. Geosci. Remote Sens. 50(10), 3707–3716 (2012)

  12. Song, H., Liu, Q., Wang, G., Hang, R., Huang, B.: Spatiotemporal satellite image fusion using deep convolutional neural networks. IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens. 11(3), 821–829 (2018)

  13. Li, Y., Li, J., He, L., Chen, J., Plaza, A.: A new sensor bias-driven spatio-temporal fusion model based on convolutional neural networks. Sci. China Inf. Sci. 63(4), 140302 (2020)

  14. Liu, X., Deng, C., Chanussot, J., Hong, D., Zhao, B.: StfNet: a two-stream convolutional neural network for spatiotemporal image fusion. IEEE Trans. Geosci. Remote Sens. 57(9), 6552–6564 (2019)

  15. Tan, Z., Gao, M., Li, X., Jiang, L.: A flexible reference-insensitive spatiotemporal fusion model for remote sensing images using conditional generative adversarial network. IEEE Trans. Geosci. Remote Sens. 60, 1–13 (2021)

  16. Chen, J., Wang, L., Feng, R., Liu, P., Han, W., Chen, X.: CycleGAN-STF: spatiotemporal fusion via CycleGAN-based image generation. IEEE Trans. Geosci. Remote Sens. 59(7), 5851–5865 (2020)

  17. Song, B., Liu, P., Li, J., Wang, L., Zhang, L., He, G., et al.: MLFF-GAN: a multilevel feature fusion with GAN for spatiotemporal remote sensing images. IEEE Trans. Geosci. Remote Sens. 60, 1–16 (2022)

  18. Zhu, X., Helmer, E.H., Gao, F., Liu, D., Chen, J., Lefsky, M.A.: A flexible spatiotemporal method for fusing satellite images with different resolutions. Remote Sens. Environ. 172, 165–177 (2016)

  19. Xue, S., Qiu, W., Liu, F., Jin, X.: Wavelet-based residual attention network for image super-resolution. Neurocomputing 382, 116–126 (2020)

  20. Huang, H., He, R., Sun, Z., Tan, T.: Wavelet-SRNet: a wavelet-based CNN for multi-scale face super resolution. In: IEEE International Conference on Computer Vision, pp. 1689–1697. IEEE, Venice, Italy (2017)

  21. Hsu, W.Y., Jian, P.W.: Detail-enhanced wavelet residual network for single image super-resolution. IEEE Trans. Instrum. Meas. 71, 1–13 (2022)

  22. Zhang, H., Jin, Z., Tan, X., Li, X.: Towards lighter and faster: learning wavelets progressively for image super-resolution. In: 28th ACM International Conference on Multimedia, pp. 2113–2121. ACM, Seattle, USA (2020)

  23. Emelyanova, I.V., McVicar, T.R., Van Niel, T.G., Li, L.T., Van Dijk, A.I.: Assessing the accuracy of blending Landsat–MODIS surface reflectances in two landscapes with contrasting spatial and temporal dynamics: a framework for algorithm selection. Remote Sens. Environ. 133, 193–209 (2013)

  24. Wang, Z., Bovik, A.C., Sheikh, H.R., Simoncelli, E.P.: Image quality assessment: from error visibility to structural similarity. IEEE Trans. Image Process. 13(4), 600–612 (2004)

  25. Yuhas, R.H., Goetz, A.F., Boardman, J.W.: Discrimination among semi-arid landscape endmembers using the spectral angle mapper (SAM) algorithm. In: The Third Annual JPL Airborne Geoscience Workshop, pp. 147–149. AVIRIS Workshop, California, USA (1992)

Acknowledgement

This work was supported by the National Natural Science Foundation of China (NSFC) under Grant no. 42171302 and the Key R&D Program of Hubei Province, China (2021BAA185).

Author information

Correspondence to Xinghua Li.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, X., Yu, S., Li, X., Li, S., Tan, Z. (2023). A Cross-Paired Wavelet Based Spatiotemporal Fusion Network for Remote Sensing Images. In: Lu, H., et al. Image and Graphics. ICIG 2023. Lecture Notes in Computer Science, vol 14359. Springer, Cham. https://doi.org/10.1007/978-3-031-46317-4_13


  • DOI: https://doi.org/10.1007/978-3-031-46317-4_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46316-7

  • Online ISBN: 978-3-031-46317-4

  • eBook Packages: Computer Science (R0)
