Implementation and evaluation of a remote authentication system using touchless palmprint recognition

  • Regular Paper
  • Published in Multimedia Systems

Abstract

When a cellular phone is lost or stolen, it may be used improperly, or the personal information on it may be stolen by a malicious user. Biometric authentication, such as palmprint recognition, is among the strongest of the personal authentication technologies designed to prevent such misuse. Compared with a local authentication model, a remote authentication model has several advantages, such as direct authentication and authentication levels. Ito et al. proposed several palmprint recognition schemes that use correspondence matching based on the phase-only correlation. However, these schemes require the palmprint image to be captured with the hand touching a dedicated device, whereas palmprint images must be captured without such physical contact when cellular phones are used. These schemes therefore cannot be applied directly to cellular phones, since there are large positioning gaps and large differences in brightness and distortion between the images. Furthermore, they have not been implemented on cellular phones, and their performance has not been evaluated. In this paper, we adopt the remote authentication model from the two types of biometric authentication, incorporating the above advantages, and propose a remote authentication system between a cellular phone and an authentication server. As the biometric authentication scheme, we use a palmprint recognition algorithm that applies Yörük et al.’s preprocessing technique to Ito et al.’s and Iitsuka et al.’s schemes. We implement the proposed system using two different types of Android terminal as the user-side terminal, and we show the validity of the proposed system by evaluating its accuracy and processing time. We furthermore discuss the problem of an impersonation attack on the proposed system and consider solutions to this problem from the viewpoints of security and usability.


Notes

  1. The aim of this experiment is to confirm the performance when the position is adjusted. We use palmprint images captured with the hand touching the dedicated device, which have smaller positioning gaps. For these images, we use the preprocessing technique of Zhang’s research team [11] to obtain the palmprint region.

References

  1. Jain, A., Bolle, R., Pankanti, S.: In: Jain, A., Bolle, R., Pankanti, S. (eds.) BIOMETRICS: Personal Identification in Networked Society, pp. 1–41. Kluwer Academic Publishers, New York (2002)

  2. Zhang, D.D.: Palmprint Authentication. Kluwer Academic Publishers, Massachusetts (2004)

  3. Song, Y., Lee, C., Kim, J.: In: Proceedings of 2004 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS 2004), pp. 524–527. IEEE Computer Society (2004)

  4. Lee, C., Lee, S., Kim, J.: In: Proceedings of Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition (SSPR2006 and SPR2006). LNCS, vol. 4109, pp. 358–365. Springer, New York (2006)

  5. Baltscheffsky, P., Anderson, P.: In: Proceedings of 1986 International Carnahan Conference on Security Technology (ICCST 1986), pp. 229–234. IEEE Computer Society (1986)

  6. Kung, S.Y., Lin, S.H., Fang, M.: In: Proceedings of 1995 IEEE Workshop on Neural Networks for Signal Processing, pp. 323–332. IEEE Computer Society (1995)

  7. Boles, W., Chu, S.: In: Proceedings of IEEE Region 10 Annual Conference. Speech and Image Technologies for Computing and Telecommunications (TENCON’97), pp. 295–298. IEEE Computer Society (1997)

  8. Shu, W., Zhang, D.D.: In: Proceedings of Digital Image Computing: Techniques and Applications (DICTA 1997), pp. 551–554. Australian Pattern Recognition Society (1997)

  9. Shu, W., Zhang, D.D.: Opt. Eng. 37(8), 2359 (1998)

  10. Kong, W.K., Zhang, D.D.: In: Proceedings of 16th International Conference on Pattern Recognition (ICPR 2002), pp. 807–810. IEEE Computer Society (2002)

  11. Zhang, D.D., Kong, W.K., You, J., Wong, M.: IEEE Trans. Pattern Anal. Mach. Intell. 25(9), 1041 (2003)

  12. Kong, A.W.K., Zhang, D.D.: In: Proceedings of 17th International Conference on Pattern Recognition (ICPR 2004), pp. 520–523. IEEE Computer Society (2004)

  13. Kong, A., Zhang, D.D., Kamel, M.: Pattern Recognit. 39(3), 478 (2006)

  14. Jia, W., Huang, D.S., Zhang, D.D.: Pattern Recognit. 41(5), 1504 (2008)

  15. Zhang, D.D., Guo, Z., Lu, G., Zhang, L., Zuo, W.: IEEE Trans. Instrum. Meas. 59(2), 480 (2010)

  16. Duta, N., Jain, A.K., Mardia, K.V.: Pattern Recognit. Lett. 23(4), 477 (2002)

  17. Han, C.C., Cheng, H.L., Lin, C.L., Fan, K.C.: Pattern Recognit. 36(2), 371 (2003)

  18. Connie, T., Jin, A.T.B., Ong, M.G.K., Ling, D.N.C.: Image Vis. Comput. 23(5), 501 (2005)

  19. Sun, Z., Tan, T., Wang, Y., Li, S.Z.: In: Proceedings of 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), pp. 279–284. IEEE Computer Society (2005)

  20. Ribaric, S., Fratric, I.: IEEE Trans. Pattern Anal. Mach. Intell. 27(11), 1698 (2005)

  21. Lin, C.L., Chuang, T.C., Fan, K.C.: Pattern Recognit. 38(12), 2639 (2005)

  22. Hu, D., Feng, G., Zhou, Z.: Pattern Recognit. 40(1), 339 (2007)

  23. Hennings-Yeomans, P.H., Kumar, B.V.K.V., Savvides, M.: IEEE Trans. Inf. Forensics Secur. 2(3), 613 (2007)

  24. Ito, K., Aoki, T., Nakajima, H., Kobayashi, K., Higuchi, T.: In: Proceedings of 13th IEEE International Conference on Image Processing (ICIP 2006), pp. 2669–2672. IEEE Computer Society (2006)

  25. Ito, K., Aoki, T., Nakajima, H., Kobayashi, K., Higuchi, T.: In: Proceedings of 2006 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS 2006), pp. 215–218. IEEE Computer Society (2006)

  26. Ito, K., Aoki, T., Nakajima, H., Kobayashi, K., Higuchi, T.: IEICE Trans. Fundam. Electron. Commun. Comput. Sci. E91-A(4), 1023 (2008)

  27. Iitsuka, S., Ito, K., Aoki, T.: In: Proceedings of 19th International Conference on Pattern Recognition (ICPR 2008), pp. 1–4. IEEE Computer Society (2008)

  28. Iitsuka, S., Miyazawa, K., Aoki, T.: In: Proceedings of 16th IEEE International Conference on Image Processing (ICIP 2009), pp. 1973–1976. IEEE Computer Society (2009)

  29. Ito, K., Iitsuka, S., Aoki, T.: In: Proceedings of 16th IEEE International Conference on Image Processing (ICIP 2009), pp. 1977–1980. IEEE Computer Society (2009)

  30. Yörük, E., Konukoğlu, E., Sankur, B., Darbon, J.: IEEE Trans. Image Process. 15(7), 1803 (2006)

  31. Ota, H., Kiyomoto, S., Tanaka, T.: IEICE Trans. Fundam. Electron. Commun. Comput. Sci. E88-A(1), 287 (2005)

  32. The Hong Kong Polytechnic University (PolyU) Palmprint Database. http://www4.comp.polyu.edu.hk/~biometrics/

Author information

Correspondence to Haruki Ota.

Appendix: Palmprint recognition algorithm

This appendix describes the palmprint recognition algorithm used in this paper.

This palmprint recognition algorithm consists of a preprocessing stage of six steps including Yörük et al.’s technique and a matching stage of two steps.

1.1 Preprocessing stage

This subsection describes the six-step preprocessing stage in the palmprint recognition algorithm.

The preprocessing stage consists of the following six steps, where the fifth step includes Yörük et al.’s preprocessing technique.

  1. Extraction of an image.

  2. Reduction of the extracted image.

  3. Flesh color detection based on the HSV color system.

  4. Opening.

  5. Detection of key points.

  6. Extraction of the palmprint region.

1.1.1 Extraction of image

The right half of the input image is extracted, because only the right half is required to detect the key points used to extract the palmprint region. This extraction shortens the processing time. In this algorithm, the captured image is 1,280 × 960 pixels, the downsampled input image is 640 × 480 pixels, and the extracted image is 320 × 480 pixels. A guide for image capture is displayed when a user captures an image of his/her hand. At this time, the background must be a color other than flesh color and its related colors, such as red, orange, and yellow.

1.1.2 Reduction of extracted image

The extracted image is reduced to half size, which further shortens the processing time. In this algorithm, the reduced image is 160 × 240 pixels.
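As a rough illustration, these two steps amount to simple slicing when the image is stored as nested lists of pixels. The nearest-neighbor sampling used below is an assumption; the paper does not specify the reduction method.

```python
def extract_right_half(img):
    """Keep the right half of each row (e.g. 640 x 480 -> 320 x 480)."""
    return [row[len(row) // 2:] for row in img]

def reduce_half(img):
    """Halve both dimensions by keeping every other pixel
    (nearest-neighbor sampling; an assumption, not the paper's method)."""
    return [row[::2] for row in img[::2]]
```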

1.1.3 Flesh color detection based on HSV color system

The reduced image, initially represented in the RGB color system, is converted to the HSV color system, which enables flesh color to be detected robustly against variations in brightness. The palm can then be detected by its flesh color. In this algorithm, the color of an image block is judged to be flesh color when the H channel (in degrees) of the block satisfies either of the following conditions:

$$ \left\{\begin{array}{l} 0 \leq H \leq 50\\ 300 \leq H \leq 360\\ \end{array} \right. $$

Using the H channel, the image is converted into a binary image separating the flesh-color domain from the remaining domain.
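A minimal sketch of this test, using Python's standard colorsys module with the hue scaled to degrees; the per-block judgement described above is simplified here to a per-pixel one.

```python
import colorsys

def is_flesh(r, g, b):
    """Judge an 8-bit RGB pixel as flesh-colored from its HSV hue (degrees)."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue = h * 360.0
    return 0 <= hue <= 50 or 300 <= hue <= 360

def flesh_mask(rgb_image):
    """Binary image: 1 for the flesh-color domain, 0 for the other domain."""
    return [[1 if is_flesh(*px) else 0 for px in row] for row in rgb_image]
```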

1.1.4 Opening

Flesh color may be detected in domains other than the palm; such domains appear as small connected components. These small connected components are eliminated by the opening processing, which consists of erosion processing followed by dilation processing, two fundamental morphological operations. Erosion replaces the target pixel with the minimum value of the pixels inside the kernel region, while dilation replaces it with the maximum value. Opening carries out the erosion process several times and then repeats the dilation process the same number of times. In this algorithm, erosion and dilation are each carried out once, and a disk-type structuring element with a radius of 3 pixels is used as the kernel.
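The opening operation can be sketched in pure Python on a binary image stored as nested lists; ignoring out-of-bounds neighbors, as done below, is one of several possible border conventions and is an assumption.

```python
def disk(radius):
    """Offsets of a disk-type structuring element."""
    return [(dy, dx) for dy in range(-radius, radius + 1)
            for dx in range(-radius, radius + 1)
            if dy * dy + dx * dx <= radius * radius]

def morph(img, kernel, op):
    """Apply op (min = erosion, max = dilation) over the kernel region."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = op(img[y + dy][x + dx] for dy, dx in kernel
                           if 0 <= y + dy < h and 0 <= x + dx < w)
    return out

def opening(img, radius=3):
    """Erosion followed by dilation with the same disk kernel."""
    k = disk(radius)
    return morph(morph(img, k, min), k, max)
```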

1.1.5 Detection of key points

The vertices formed by the junctions of the index and middle fingers, middle and ring fingers and ring and little fingers are detected in order to extract the palmprint region. Figure 12 shows an example of palmprint region extraction.

  1. A chain code is generated for the above-mentioned binary image, taking the center of the left end of the image as the starting point (the square point in Fig. 12). Generating the chain code yields the coordinates of the boundary of the palm.

  2. The Euclidean distances between the starting point and the boundary coordinates are computed and plotted as a graph, in which impulse noise is eliminated by a median filter. The valleys in the graph are detected using the slopes of the lines passing through the key points. Detecting these valleys locates the vertices between the fingers, since the valleys terminate at those vertices. In this algorithm, the vertices between the index and middle fingers and between the ring and little fingers are taken as the key points (the circle points in Fig. 12).
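The distance-graph computation can be sketched as follows. The valley test here is a plain local-minimum check, a simplification of the slope-based detection described above.

```python
import math
import statistics

def distance_profile(start, boundary):
    """Euclidean distance from the starting point to each boundary point."""
    return [math.dist(start, p) for p in boundary]

def median_filter(signal, width=3):
    """Eliminate impulse noise with a sliding median."""
    half = width // 2
    return [statistics.median(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def valleys(signal):
    """Indices where the filtered profile turns from falling to rising."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i - 1] > signal[i] <= signal[i + 1]]
```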

Fig. 12 Example of palmprint region extraction

1.1.6 Extraction of palmprint region

The perpendicular bisectors of the segments connecting the datum points are computed, and a point at a fixed distance from their intersection point is taken as the centroid (the cross in Fig. 12). A rectangular region image is then extracted with the centroid as the center of the palmprint region (the square frame in Fig. 12). The extracted image is normalized to 160 × 160 pixels and converted into a grayscale image: the RGB image is converted into the YIQ color system, and the Y channel gives the grayscale image, which is called the palmprint region. Determining the centroid from the key points makes it possible to normalize, to some extent, the rotation, scaling, and translation between palmprint regions.
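The Y-channel conversion and a key-point-based centroid can be sketched as follows; the centroid helper here uses the midpoint of a single key-point segment with a perpendicular offset, and both the offset magnitude and its direction are illustrative parameters, not values from the paper.

```python
import math

def to_grayscale(rgb_image):
    """Y (luma) channel of the YIQ color system: Y = 0.299R + 0.587G + 0.114B."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_image]

def centroid_from_keypoints(p1, p2, offset):
    """Point at a fixed distance from the midpoint of segment p1-p2, along its
    perpendicular (which side faces the palm is a convention; hypothetical helper)."""
    mx, my = (p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    norm = math.hypot(dx, dy)
    return (mx - offset * dy / norm, my + offset * dx / norm)
```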

1.2 Matching stage

This subsection describes the two-step matching stage in the palmprint recognition algorithm.

The matching stage consists of the following two steps:

  1. Mapping between the images.

  2. Computation of the matching score.

1.2.1 Mapping between images

There are nonlinear distortion and projective transformation between palm images, and both can be approximated by local translations. Correspondence matching based on the phase-only correlation (POC) can therefore handle the nonlinear distortion and projective transformation, as well as the large positioning gaps that cannot be normalized in the preprocessing stage. In this algorithm, the block size is 32 × 32 pixels and the number of corresponding points to be found is 16.
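The idea behind POC can be illustrated in one dimension with a small pure-Python sketch (the algorithm itself works on 2-D blocks; this is an illustration, not the paper's implementation). For two signals related by a circular shift, the normalized cross power spectrum is a pure phase term, so its inverse DFT is an impulse whose position gives the displacement and whose height (at most 1) serves as a similarity measure.

```python
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def poc(f, g):
    """Phase-only correlation: inverse DFT of the normalized cross power spectrum."""
    R = []
    for Fk, Gk in zip(dft(f), dft(g)):
        cross = Fk * Gk.conjugate()
        R.append(cross / abs(cross) if abs(cross) > 1e-12 else 0j)
    return [v.real for v in idft(R)]

f = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
d = 3
g = [f[(n - d) % len(f)] for n in range(len(f))]  # f circularly shifted by d
r = poc(f, g)
peak = max(range(len(r)), key=lambda n: r[n])
recovered = (len(f) - peak) % len(f)  # displacement recovered from the peak position
```

In the correspondence matching described above, the 2-D analogue of this peak location gives the local translation of each 32 × 32 block.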

1.2.2 Computation of matching score

First, a block image of 32 × 32 pixels is extracted by setting the datum point and the corresponding point at the center of the block. The normalized cross power spectrum between corresponding local blocks is computed over the inherent frequency band of palmprint images. Next, the normalized cross power spectra of all block pairs are averaged. The average band-limited phase-only correlation (BLPOC) function is then computed by the inverse 2-dimensional discrete Fourier transform of the averaged spectrum. Finally, the highest peak value of the average BLPOC function is taken as the matching score.


About this article

Cite this article

Ota, H., Aoyama, S., Watanabe, R. et al. Implementation and evaluation of a remote authentication system using touchless palmprint recognition. Multimedia Systems 19, 117–129 (2013). https://doi.org/10.1007/s00530-012-0283-z
