Thresholding binary coding for image forensics of weak sharpening

https://doi.org/10.1016/j.image.2020.115956

Highlights

  • The effects of sharpening on the image and the corresponding unsharp mask are analyzed.

  • Thresholding binary coding is proposed for the texture analysis.

  • The constructed features are effective to identify weak sharpening.

  • The proposed algorithm is robust to post-JPEG compression and noise addition.

Abstract

Image forensics of sharpening has attracted great research interest in recent decades. State-of-the-art techniques achieve high accuracy in detecting strong sharpening, but detecting weak sharpening remains a challenge. This paper proposes an algorithm based on thresholding binary coding for image sharpening detection. The overshoot artifact introduced by sharpening enlarges the difference between the local maximum and minimum of both image pixels and unsharp mask elements, and on this basis the threshold local binary pattern operator is applied to capture the trace of sharpening. The patterns are then coded according to rotation symmetry invariance and texture type. Features are extracted from the statistical distribution of the coded patterns and fed to a classifier for sharpening detection. In practice, two classifiers are constructed, for lightweight and offline applications respectively: a single Fisher linear discriminant (FLD) with 182 features, and an ensemble classifier (EC) with 5460 features. Experimental results on the BOSS, NRCS and RAISE datasets show that, for weak sharpening detection, the FLD outperforms the CNN and SVMs with EPTC, EPBC and LBP features, and that using the EC with TBC features further improves performance, obtaining better results than ECs with TLBP and SRM features. In addition, the proposed algorithm is robust to post-JPEG compression and noise addition and can differentiate sharpening from other manipulations.

Introduction

Images are among the most important carriers of information in judicial evidence, news communication, medical diagnosis, and remote sensing. Owing to the rapid development of computer technology, modern image processing techniques make it very easy to manipulate digital images for a given application [1]. However, images may be maliciously tampered with while leaving no visual abnormalities, which could lead to serious consequences. Throughout history, hundreds of image tampering events have been reported in which images were forged to mislead public opinion. Therefore, it is of practical significance to research image forensics technologies that validate image authenticity.

For decades, researchers have proposed many effective image forgery detection methods, such as tampered-area localization methods based on inconsistencies of geometric properties [2], lighting environments [3] and camera noise [4]. Forensics of image manipulations is also one of the research hotspots in image forensics, as image tampering usually involves manipulations that visually hide the trace of tampering [5]. Forensics of image manipulations can help reveal an image's operation history and contribute to tampered-area localization by exposing the different operation histories of local image areas [6]. Moreover, in some cases only original images are accepted; for example, images used as judicial evidence in court must not be altered. In recent years, research on image manipulation detection has made great progress. Effective methods have been proposed to detect manipulations such as contrast changing [7], [8], sharpening [9], [10], [11], [12], [13], [14], [15], median filtering [16], blurring [17], resizing [18] and JPEG compression [19]. Furthermore, some methods have been proposed to estimate the parameters of manipulations such as contrast changing [20] and resizing [21], [22], which makes the forensics results more convincing.

Among the many image manipulations, sharpening is one of the most common operations and has been integrated into popular image processing software such as Photoshop and GIMP. Sharpening aims to highlight transitions in intensity by adding a mask to the targeted image, thus enhancing texture regions [1]. After sharpening, dark pixels become darker and bright pixels become brighter in texture regions, which leads to the overshoot artifact. Over the past several years, the overshoot artifact has been treated as a unique signature of image sharpening and exploited for sharpening detection. In 2009, Cao et al. [9] proposed an image sharpening detection method for the first time. The authors analyzed the gradient aberration at the two ends of the gray histogram and the overshoot artifact along step edges, and constructed two kinds of features for sharpening detection. In 2011, Cao et al. [10] further studied the overshoot, located the overshoot points in the crosswise pixel sequences along side-planar edges, and took the average strength of all overshoot points as the metric for sharpening detection. To effectively capture the changes of edges caused by sharpening, in 2013, Ding et al. [11] took advantage of the ability of the local binary pattern (LBP) [23] for texture analysis and used the histogram of rotation-invariant LBP along the edges as features. In 2015, Ding et al. [12] observed that the changes of edges are reflected mainly in the direction perpendicular to the edges, and proposed edge perpendicular binary coding (EPBC) for sharpening detection. EPBC first selects the pixel sequence perpendicular to the edge, with the center pixel located on the edge, and then applies binary coding to the pixel sequence; the histogram of the codes is calculated as features. Afterward, the authors replaced binary coding with ternary coding [13], constructing edge perpendicular ternary coding (EPTC) features.
In 2018, the authors improved EPTC by coding with base 3 instead of two rounds of coding with base 2 [14]. However, all the methods mentioned above consider only the overshoot artifact along the edges while ignoring the changes in other types of texture regions. Moreover, when images are sharpened with weak strength, the overshoot artifact along the edges is not obvious, and the performance of these methods degrades significantly. Recently, Ye et al. [15] built a classifier based on a convolutional neural network (CNN), which achieved very good performance for strong sharpening detection.
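The LBP texture coding that these edge-based detectors build on can be sketched in a few lines of NumPy. This is a minimal, rotation-variant 3 × 3 version for illustration only (the function names are assumptions; Ding et al. additionally restrict the histogram to edge pixels and apply the rotation-invariant mapping):

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 local binary pattern: each of the 8 neighbours
    contributes one bit, set when the neighbour >= the centre pixel."""
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]  # centre pixels (border excluded)
    # neighbour offsets in clockwise order starting at top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]
        codes |= (nb >= c).astype(np.int32) << bit
    return codes

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes, usable as a texture feature."""
    h = np.bincount(lbp_codes(img).ravel(), minlength=256)
    return h / h.sum()
```

A 256-bin histogram of these codes is the raw texture descriptor; rotation-invariant mapping then merges bins that are circular shifts of one another.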

This paper addresses the detection of weak image sharpening. We analyze the effects of sharpening on the image and the corresponding unsharp mask, and show that after sharpening the differences between the local maximum and minimum of both image pixels and unsharp mask elements become larger; based on this observation, thresholding binary coding (TBC) is proposed for sharpening detection. First, the threshold local binary pattern operator is applied to the image or the unsharp mask for texture analysis. Then the texture patterns are reduced by pattern coding based on rotation symmetry invariance and texture type. The TBC features are extracted from the statistical distribution of the coded texture patterns. For lightweight applications, where computing resources are limited or real-time processing is required, a single Fisher linear discriminant (FLD) [24] is constructed with 182 features; for offline image forensics, where ample computing resources are available, an ensemble classifier (EC) [25] is constructed with 5460 features calculated using various smoothing filters and thresholds. The experimental results show that the FLD outperforms the SVM [26] trained with EPTC [14], EPBC [12] and LBP [11] features, can distinguish sharpening from other manipulations, and is robust to post-JPEG compression and noise addition. For weak sharpening detection, the FLD is superior to the CNN [15]. Moreover, using the EC with the high-dimensional features further improves weak sharpening detection, achieving higher accuracies than using TLBP [27] and SRM [28] features.

The rest of this paper is organized as follows. The effects of sharpening on the image and unsharp mask are analyzed in Section 2, based on which the thresholding binary coding algorithm for sharpening detection is proposed in Section 3. Experiments are presented in Section 4. Finally, conclusions are made in Section 5.

Section snippets

Effects of sharpening on unsharp mask

The form of image sharpening can be represented as $I_s = I_o + \lambda G$, where $I_s$, $I_o$ and $G$ are the sharpened image, the original image and a mask respectively, and $\lambda$ is a positive factor that controls the sharpening strength. A small value of $\lambda$ de-emphasizes the contribution of the mask, which corresponds to weak sharpening. The mask can be obtained by applying the Laplacian operator to the original image, $G = \nabla^2 I_o$, or by subtracting a smoothed version of the original image from the image itself, $G = I_o - \bar{I}_o = I_o - f_{\mathrm{LowPass}}(I_o)$.
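As a concrete illustration of the second mask definition, the sketch below implements unsharp-mask sharpening with a 3 × 3 mean filter as the low-pass $f_{\mathrm{LowPass}}$. The filter choice and the function names are assumptions made only for this example; the paper considers various smoothing filters:

```python
import numpy as np

def box_blur3(img):
    """3x3 mean filter with edge-replicated padding."""
    p = np.pad(img.astype(np.float64), 1, mode='edge')
    return sum(p[1 + dy:p.shape[0] - 1 + dy, 1 + dx:p.shape[1] - 1 + dx]
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0

def usm_sharpen(img, lam=0.5):
    """Unsharp-mask sharpening I_s = I_o + lambda * G with
    G = I_o - lowpass(I_o); a small lam gives weak sharpening."""
    g = img.astype(np.float64) - box_blur3(img)
    return np.clip(img + lam * g, 0, 255)
```

Around a step edge the mask $G$ is positive on the bright side and negative on the dark side, so the sharpened result over- and undershoots the original values, which is exactly the overshoot artifact exploited for detection.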

Thresholding binary coding algorithm

Aiming to capture the trace of sharpening, this paper proposes an algorithm called thresholding binary coding (TBC) to construct features for sharpening detection. Fig. 4 shows the process of feature construction. The input of the algorithm can be either the image or the calculated USM. Specifically, the image itself can be treated as a USM, calculated by subtracting a pure image (an excessively smoothed version) from the image. In this way, the input can be unified as the result of USM
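The thresholded comparison at the core of the TLBP operator can be sketched as follows, assuming a 3 × 3 neighbourhood in which a neighbour's bit is set only when it exceeds the centre by more than a threshold t (a simplified illustration; the exact neighbourhood, threshold settings and the subsequent 'rsim' pattern coding follow the paper and are not reproduced here):

```python
import numpy as np

def tlbp_codes(x, t=2):
    """Threshold LBP: a neighbour sets its bit only when it exceeds the
    centre by more than t, so small natural fluctuations are ignored and
    the enlarged local max-min gap caused by sharpening stands out."""
    x = x.astype(np.int32)
    c = x[1:-1, 1:-1]  # centre elements (image pixels or USM elements)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = x[1 + dy:x.shape[0] - 1 + dy, 1 + dx:x.shape[1] - 1 + dx]
        codes |= (nb - c > t).astype(np.int32) << bit
    return codes
```

Unlike plain LBP, a flat region produces the all-zero code regardless of noise below the threshold, so the code histogram responds mainly to the amplified local differences that sharpening introduces.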

Experimental setup

Images from three public databases, BOSS [32], NRCS2 and RAISE [33], are used for performance testing. 1000 images are randomly selected from the NRCS database, converted to grayscale, and then cropped to size 512 × 512. BOSS contains 10,000 images with fixed size 512 × 512, derived from rescaled and cropped natural images of various sizes. For performance testing on high-resolution images, 2000 images from RAISE_2k are
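The preprocessing described above (grayscale conversion followed by cropping to 512 × 512) can be sketched as below; the BT.601 luminance weights and the center-crop position are assumptions for illustration, since the paper does not specify them:

```python
import numpy as np

def to_gray(rgb):
    """ITU-R BT.601 luminance conversion (an assumption; the paper does
    not state which grayscale conversion was used)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def center_crop(img, size=512):
    """Crop a size x size patch from the image centre (the crop position
    is an assumption)."""
    h, w = img.shape[:2]
    y, x = (h - size) // 2, (w - size) // 2
    return img[y:y + size, x:x + size]
```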

Conclusions

Image sharpening introduces overshoot artifacts in texture regions, which enlarge the difference between the local maximum and minimum of both image pixels and USM elements. To exploit this feature for sharpening detection, this paper proposes an algorithm based on thresholding binary coding (TBC). The TLBP operator is applied to the image or USM to obtain the texture patterns, which are further reduced by ‘rsim’ mapping based on rotation symmetry invariance and texture type. Then

CRediT authorship contribution statement

Ping Wang: Methodology, Software, Formal analysis, Writing - original draft, Visualization. Fenlin Liu: Conceptualization, Resources, Writing - review & editing, Supervision, Funding acquisition. Chunfang Yang: Validation, Investigation, Resources, Writing - review & editing, Project administration, Funding acquisition.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (No. 61772549, 61872448, U1736214, 61602508, and 61601517), and the National Key R&D Program of China (No. 2016YFB0801303 and 2016QY01W0105). The authors sincerely appreciate Prof. Zhu Guopu for kindly offering the code of EPBC [12] for comparison and thank the editor and the anonymous reviewers for their valuable comments which help improve the work.

References (34)

  • Peng, B., et al., Image forensics based on planar contact constraints of 3D objects, IEEE Trans. Inf. Forensics Secur. (2018)

  • Kee, E., et al., Exposing photo manipulation from shading and shadows, ACM Trans. Graph. (2014)

  • Ferrara, P., et al., Image forgery localization via fine-grained analysis of CFA artifacts, IEEE Trans. Inf. Forensics Secur. (2012)

  • Cao, G., et al., Detection of image sharpening based on histogram aberration and ringing artifacts

  • Cao, G., et al., Unsharp masking sharpening detection via overshoot artifacts analysis, IEEE Signal Process. Lett. (2011)

  • Ding, F., et al., A novel method for detecting image sharpening based on local binary pattern

  • Ding, F., et al., Edge perpendicular binary coding for USM sharpening detection, IEEE Signal Process. Lett. (2015)