
Entropy-Constrained Implicit Neural Representations for Deep Image Compression


Abstract:

Implicit neural representations (INRs) for various data types have gained popularity in deep learning owing to their effectiveness. However, previous studies on INRs have focused only on recovering the original representations. This letter investigates an image compression model based on INRs that applies a model compression technique for entropy-constrained neural networks. Specifically, the proposed model trains a multilayer perceptron (MLP) to overfit a single image and then optimizes a compressed representation of its weights using additive uniform noise. Accordingly, the proposed model minimizes the size of the model weights efficiently in an end-to-end manner, and this training procedure is well suited to adjusting the rate-distortion trade-off in image compression. In contrast to other model compression techniques, the proposed model requires no additional training process or memory cost. By introducing an entropy loss, this letter demonstrates that the proposed model preserves high image quality while maintaining a smaller model size. Experimental results show that the proposed model achieves performance comparable to conventional image compression models without incurring high storage costs.
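The following is a minimal PyTorch sketch of the pipeline the abstract outlines: an MLP is overfit to one image, additive uniform noise on the weights simulates quantization during training, and a rate penalty on the noisy weights is minimized jointly with the reconstruction error. The network shape, the noise width `delta`, the weighting `lam`, and the log-magnitude rate proxy are illustrative assumptions, not the authors' exact entropy model.

```python
import torch
import torch.nn as nn
from torch.func import functional_call

class INR(nn.Module):
    """Coordinate MLP: (x, y) in [-1, 1]^2 -> RGB."""
    def __init__(self, hidden=64, depth=3):
        super().__init__()
        layers, dim = [], 2
        for _ in range(depth):
            layers += [nn.Linear(dim, hidden), nn.ReLU()]
            dim = hidden
        layers += [nn.Linear(dim, 3)]
        self.net = nn.Sequential(*layers)

    def forward(self, coords):
        return self.net(coords)

def train_step(model, opt, coords, pixels, delta=1.0 / 255, lam=1e-4):
    opt.zero_grad()
    # Additive uniform noise stands in for rounding each weight to a grid
    # of width `delta` (quantization-aware training); gradients still flow
    # to the underlying weights because the noise is a constant offset.
    noisy = {n: p + (torch.rand_like(p) - 0.5) * delta
             for n, p in model.named_parameters()}
    recon = functional_call(model, noisy, (coords,))
    distortion = ((recon - pixels) ** 2).mean()        # reconstruction MSE
    # Crude differentiable rate proxy: larger-magnitude weights cost more
    # bits; the letter instead fits a proper entropy model to the weights.
    rate = sum(torch.log1p(w.abs() / delta).sum() for w in noisy.values())
    loss = distortion + lam * rate                     # rate-distortion objective
    loss.backward()
    opt.step()
    return distortion.item(), rate.item()

# Usage (illustrative): coords is an (H*W, 2) grid in [-1, 1]^2 and pixels
# the matching (H*W, 3) RGB targets for the single image being compressed.
# model = INR(); opt = torch.optim.Adam(model.parameters(), lr=1e-3)
```

After training, one would round each weight to the nearest multiple of `delta` and entropy-code the result; the uniform noise makes the training objective a differentiable stand-in for that quantization step.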
Published in: IEEE Signal Processing Letters (Volume: 30)
Page(s): 663 - 667
Date of Publication: 24 May 2023
