
UnmixDiff: Unmixing-Based Diffusion Model for Hyperspectral Image Synthesis



Abstract:

The scarcity of hyperspectral images (HSIs) hinders the development of processing methods and downstream applications. HSI synthesis, which aims to generate realistic samples from existing datasets, is a promising and economical solution to the HSI data shortage. Inspired by the impressive performance of diffusion models (DMs) in image synthesis, this article proposes an unmixing diffusion (UnmixDiff) model for high-quality HSI generation. The method first trains an unmixing network to learn the distribution characteristics of objects (abundance). By incorporating the unmixing autoencoder into the DM, UnmixDiff transfers HSI generation into the abundance domain, which preserves the consistency of the generated spectral profiles, reduces computational complexity, and lends a clear physical interpretation to HSI synthesis. A diffusion model is then constructed in the abundance space to generate realistic abundance maps. Instead of synthesizing hyperspectral images directly, the proposed UnmixDiff synthesizes abundance maps that capture the distribution of objects rather than superficial textures. Realistic HSI samples are then generated by mixing the synthesized abundances with the scene endmembers. In comparative experiments, the proposed method achieves state-of-the-art performance in HSI synthesis, effectively alleviating HSI data scarcity and supporting widespread HSI applications. The code is available at https://github.com/yuyang95/UnmixingDM.
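The final mixing step described in the abstract corresponds to the standard linear mixing model, in which each pixel's spectrum is a nonnegative, sum-to-one combination of endmember signatures (X = E A). The following is a minimal sketch of that step only, assuming NumPy, a (P, H, W) abundance layout, and a (B, P) endmember matrix; the function name mix_abundance and all shapes are illustrative assumptions, not the authors' actual implementation.

import numpy as np

def mix_abundance(abundance: np.ndarray, endmembers: np.ndarray) -> np.ndarray:
    """Reconstruct an HSI cube from synthesized abundance maps (linear mixing model).

    abundance:  (P, H, W) array of P abundance maps (nonnegative, sum-to-one per pixel).
    endmembers: (B, P) array with one B-band spectral signature per endmember.
    Returns:    (B, H, W) cube, X = E @ A applied pixelwise.
    """
    p, h, w = abundance.shape
    a = abundance.reshape(p, h * w)        # flatten spatial dimensions: (P, H*W)
    x = endmembers @ a                     # linear mixing: (B, H*W)
    return x.reshape(-1, h, w)             # restore spatial layout: (B, H, W)

# Toy usage: 4 endmembers, 50 bands, 64x64 abundance maps
# (in UnmixDiff the abundance maps would come from the diffusion model).
rng = np.random.default_rng(0)
abund = rng.random((4, 64, 64))
abund /= abund.sum(axis=0, keepdims=True)  # enforce the sum-to-one constraint
endm = rng.random((50, 4))
hsi = mix_abundance(abund, endm)           # synthesized HSI cube, shape (50, 64, 64)

Because the diffusion model operates on P abundance channels rather than B spectral bands (typically P << B), this decomposition is also what yields the computational savings the abstract mentions.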
Article Sequence Number: 5524018
Date of Publication: 09 July 2024
