Huffman-based lossless image encoding scheme
11 September 2021
Erdal Erdal
Abstract

The volume of data produced on today's Internet and in the computing world grows day by day, and with the passage of time the storage and archiving of these data have become a significant problem. Compression methods address this problem by reducing data sizes, so compression algorithms have received great attention. In this study, two efficient encoding algorithms are presented and explained in a clear manner. Both algorithms rely on frequency analysis: for each character, the character that most frequently follows it is determined, and the Huffman encoding algorithm is then applied to the result. The proposed scheme achieves a compression ratio (CR) of 49.44%. To evaluate performance, 30 randomly selected images from three datasets (USC-SIPI, UCID, and STARE) were used. Excellent results were obtained on all test images relative to well-known comparison algorithms such as the Huffman encoding algorithm, the arithmetic coding algorithm, and LPHEA.
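The abstract names two ingredients without giving details: counting which symbol most often follows each symbol, and classic Huffman coding. As a rough, hedged illustration only (the function names, sample input, and the way the two pieces are combined are assumptions, not the paper's exact scheme), the two building blocks might look like this:

```python
import heapq
from collections import Counter
from itertools import count

def huffman_codes(freqs):
    """Build a prefix-free Huffman code table {symbol: bitstring} from frequencies."""
    tie = count()  # tie-breaker so the heap never compares code tables
    heap = [(f, next(tie), {sym: ""}) for sym, f in freqs.items()]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate input: a single distinct symbol
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}   # left branch gets a 0 prefix
        merged.update({s: "1" + c for s, c in t2.items()})  # right branch gets a 1
        heapq.heappush(heap, (f1 + f2, next(tie), merged))
    return heap[0][2]

def successor_frequencies(data):
    """Count, for every symbol, how often each symbol immediately follows it."""
    return Counter(zip(data, data[1:]))

data = b"abracadabra"
pairs = successor_frequencies(data)        # e.g. how often 'b' follows 'a'
codes = huffman_codes(Counter(data))       # plain Huffman table over the bytes
encoded = "".join(codes[b] for b in data)  # 23 bits vs. 88 bits uncompressed
```

The successor counts sketch the "highest frequency after each character" statistic the abstract mentions; how the paper actually feeds that context model into the Huffman stage is not specified in this abstract.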

© 2021 SPIE and IS&T 1017-9909/2021/$28.00
Erdal Erdal "Huffman-based lossless image encoding scheme," Journal of Electronic Imaging 30(5), 053004 (11 September 2021). https://doi.org/10.1117/1.JEI.30.5.053004
Received: 10 April 2021; Accepted: 30 August 2021; Published: 11 September 2021
KEYWORDS
Computer programming, Image compression, Algorithm development, Chromium, Databases, Medical imaging, Binary data