
Computers & Security

Volume 23, Issue 7, October 2004, Pages 606-614

Personalised cryptographic key generation based on FaceHashing

https://doi.org/10.1016/j.cose.2004.06.002

Abstract

Among the various computer security techniques practised today, cryptography has been identified as one of the most important solutions in an integrated digital security system. Cryptographic techniques such as encryption can provide very long keys that need not be remembered, but these keys are in turn protected by a simple password, which defeats their purpose. In this paper, we propose a novel two-stage technique to generate personalised cryptographic keys from the face biometric, which offers an inextricable link to its owner. In the first stage, an integral transform of the biometric input is discretised with a set of tokenised pseudo-random numbers to produce a bit-string representation, coined the FaceHash. In the second stage, the FaceHash is securely reduced to a single cryptographic key via Shamir secret-sharing. Tokenised FaceHashing is rigorously protective of the face data, with security comparable to cryptographic hashing of token and knowledge key-factors. The key is constructed to resist cryptanalysis even against an adversary who captures the user device or the feature descriptor.

Introduction

Security is a major concern in today's digital era. Cryptography has been recognised as one of the most popular technologies for achieving the four principal security goals, i.e. privacy, authentication, integrity and authorisation (Smith, 1997). In general, data are secured using a symmetric cryptosystem, while a public-key system is deployed for digital signatures and for secure key exchange between users. Cryptographic techniques such as encryption can provide very long passwords (strong cryptographic keys) that need not be remembered but are in turn protected by a simple password; once that password is compromised, the entire solution may fall apart.

A biometric is a feature measured from the human body that is distinguishing enough to be used for user authentication. Examples include voice, handwriting, face, eye and signature. A biometric inextricably links the authenticator to its owner, something passwords or tokens cannot do, since they can be lent or stolen. In the cryptographic key generation context, this inextricable link can replace the password and thereby rectify the aforementioned problem. In the simplest biometrics-cryptographic key application, an externally specified crypto key may be stored as a portion of a user's particulars, i.e. user name, biometric template, access privileges, etc., and released upon a successful match. This is fine for key management, but secure only when the user's particulars are stored and the matching is done in a secure region. It is therefore preferable to generate the cryptographic key directly from the user-specific biometrics. This can be done by deriving independent, non-recoverable parameters that are solely tied to a particular person, so that crypto techniques can utilise these parameters as keys for encryption or decryption purposes. Basically, the biometrics-based key generation process can follow either a front-end approach or a back-end approach (Peyravian et al., 1999). In the front-end approach, the initial seed value of the PRNG (Pseudo Random Number Generator) is modified or extended to include user-specific data, i.e. a biometrics component, whereas in the back-end approach the random numbers produced by the PRNG are post-processed to make them dependent on the biometric data. However, the representation problem remains: biometric data are continuous and statistically noisy, while cryptographic parameters are discrete and have zero uncertainty.
Biometric measurements of the same user are similar between reference and test data but never equal, and hence inadequate for cryptographic purposes, which require exact reproduction.
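The front-end/back-end distinction above can be sketched as follows. This is a minimal illustration, not Peyravian et al.'s construction: the hash-based mixing, the 128-bit key length and the function names are our own assumptions.

```python
import hashlib
import random

def frontend_key(biometric_bits: bytes, token_seed: int) -> bytes:
    """Front-end approach (sketch): fold the biometric component into
    the PRNG's initial seed before drawing any key material."""
    seed = hashlib.sha256(token_seed.to_bytes(8, "big") + biometric_bits).digest()
    rng = random.Random(int.from_bytes(seed, "big"))
    return bytes(rng.getrandbits(8) for _ in range(16))  # 128-bit key

def backend_key(biometric_bits: bytes, token_seed: int) -> bytes:
    """Back-end approach (sketch): draw token-only randomness first,
    then post-process it so the output depends on the biometric."""
    rng = random.Random(token_seed)
    raw = bytes(rng.getrandbits(8) for _ in range(16))
    return hashlib.sha256(raw + biometric_bits).digest()[:16]
```

In both variants the derived key changes whenever the biometric component changes, which is exactly why the representation problem below matters: the biometric bits fed in must be exactly reproducible.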

The first notion of using a biometric parameter directly as a crypto key was proposed by Bodo (1994). However, the instability of biometrics over the course of time and a non-negligible equal error rate—the error rate occurring when the decision threshold of a system is set so that the proportion of false rejections is approximately equal to the proportion of false acceptances—hinder its direct use as a crypto key. Also, if the key is ever compromised, the use of that biometric is lost irrevocably, which is inconsistent with a system that requires periodic key updating. The Soutar et al. (1999) research outlines cryptographic key recovery from the integral correlation of fingerprint data and previously registered Bioscrypts. Bioscrypts result from the mixing of random and user-specific data, thereby preventing recovery of the original fingerprint data, with data-capture uncertainties addressed via multiply-redundant majority-result table lookups. The Soutar et al. formulation is nevertheless restrictive in that keys are externally specified and then recovered, rather than internally computed. The Davida et al. (1998) formulation outlines cryptographic signature verification of iris data without stored references. This is accomplished via open token-based storage of the user-specific Hamming codes necessary to rectify offsets in the test data, thereby allowing verification of the corrected biometrics. Such self-correcting biometric representations are applicable towards key computation, with recovery of the iris data via analysis of these codes prevented by complexity theory. The Monrose et al. key computation from user-specific keystroke (Monrose et al., 1999) and voice (Monrose et al., 2001) data is both deterministic and probabilistic.
The methodology (broadly similar in both cases) specifies the deterministic concatenation of single-bit outputs based on logical characterisations of the biometric data, in particular whether user-specific features are below (0) or above (1) some population-generic threshold. This accumulation of 0 and 1 responses, with the additional possibility of an indeterminate (∅) output for certain features, is then used in conjunction with randomised lookup tables formulated via Shamir secret-sharing (Shamir, 1979).
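The threshold-based bit derivation just described can be sketched as follows; the symmetric margin band used to decide the indeterminate ∅ case, and all numeric values, are illustrative assumptions rather than parameters from Monrose et al.

```python
def feature_bits(features, thresholds, margin=0.1):
    """Sketch of threshold-based bit derivation: each feature yields 0
    if below its population-generic threshold, 1 if above, and None
    (standing in for the indeterminate output ∅) when it falls inside
    a +/- margin band where the measurement is considered unreliable."""
    bits = []
    for f, t in zip(features, thresholds):
        if abs(f - t) <= margin:
            bits.append(None)           # indeterminate ∅
        else:
            bits.append(1 if f > t else 0)
    return bits
```

For example, `feature_bits([0.2, 0.9, 0.55], [0.5, 0.5, 0.5])` yields `[0, 1, None]`: the third feature sits too close to its threshold to be quantised reliably.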

This paper proposes a novel cryptographic key computation technique based on face biometrics. The proposed technique has two stages: feature extraction and key computation (cryptographic key interpolation) (Goh and Ngo, 2003). In the feature extraction stage, certain features of the raw input from a biometric-measuring device are examined and used to compute a bit string, coined the FaceHash. The key computation stage derives a cryptographic key from the FaceHash and cryptographic data stored in the device. If two FaceHashes are sufficiently similar, then the same cryptographic key will be generated from them. We provide experimental results to illustrate the high stability of the FaceHash, which is vital for key generation. The FaceHash is rigorously protective of the face data, with security comparable to cryptographic hashing of token and knowledge key-factors. The key is constructed to resist cryptanalysis even against an adversary who captures the user device or the feature descriptor.

The following section provides an overview of our approach, while the next three sections present its components in detail. The experimental results and a security analysis in terms of key-factor independence and non-recovery are then discussed. Finally, the last section gives the concluding remarks of this paper.

Section snippets

Biometric based cryptographic key derivation overview

As mentioned above, biometrics and cryptography are two opposed paradigms, and this motivated the formulation of a highly offset-tolerant discretisation methodology—FaceHashing. The process is crucial to our biometric-based crypto key derivation through the two-stage scheme illustrated in Fig. 1.

The proposed scheme contains two stages, with stage one subdivided into two parts:

  • Stage 1:

    Feature extraction

    • (a)

      The first step is to transform the raw face data, IN, where N is the image

Stage 1(a): wavelet Fourier-Mellin transform feature construction

The wavelet decomposition of a signal f(x) can be obtained by convolution of the signal with a family of real orthonormal basis functions, ψ_{a,b}(x):

(W_ψ f)(a, b) = |a|^{−1/2} ∫ f(x) ψ((x − b)/a) dx

where a and b are the dilation parameter and the translation parameter, respectively, with a ≠ 0. The two-dimensional wavelet transform decomposes the approximation coefficients at level j − 1 into four components: the approximation at level j, L_j, and the details in three orientations (horizontal, vertical and diagonal), D_j^{vertical}
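As a concrete illustration of the decomposition into one approximation and three oriented detail components, here is a minimal one-level 2-D sketch in NumPy. The Haar basis is assumed purely for brevity; it is not the paper's wavelet Fourier-Mellin pipeline.

```python
import numpy as np

def haar2d(img):
    """One level of a 2-D Haar wavelet decomposition (generic sketch):
    returns the approximation L and the horizontal, vertical and
    diagonal detail components, each a quarter the size of the input."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # pairwise row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # pairwise row differences
    L  = (a[:, 0::2] + a[:, 1::2]) / 2.0      # approximation L_j
    Dh = (a[:, 0::2] - a[:, 1::2]) / 2.0      # horizontal detail
    Dv = (d[:, 0::2] + d[:, 1::2]) / 2.0      # vertical detail
    Dd = (d[:, 0::2] - d[:, 1::2]) / 2.0      # diagonal detail
    return L, Dh, Dv, Dd
```

Applying the same step recursively to L yields the multi-level decomposition described above, with the approximation at each level feeding the next.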

Stage 1(b): biometrics discretisation

At this stage, the invariant face feature Γ ∈ ℝ^M, with M the log-polar spatial frequency dimension, is reduced to a bit string b ∈ {0,1}^{l_b}, with l_b the length of the bit string, via uniformly distributed secret pseudo-random numbers r ∈ [−1, 1] that are uniquely associated with a token.

Specifically, let Γ ∈ ℝ^M:

  • (1)

Use the token to generate a set of pseudo-random vectors, {r_i ∈ ℝ^M | i = 1, …, l_b}.

  • (2)

Apply the Gram-Schmidt process to transform the basis {r_i ∈ ℝ^M | i = 1, …, l_b} into an orthonormal set {r_i
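The discretisation steps above can be sketched as follows. QR factorisation is used as a numerically stable equivalent of Gram-Schmidt orthonormalisation, and the zero quantisation threshold and uniform sampling range are assumptions for illustration, not the paper's exact parameters.

```python
import numpy as np

def face_hash(gamma, token_seed, lb):
    """Sketch of FaceHash discretisation: token-seeded pseudo-random
    vectors are orthonormalised (QR here stands in for Gram-Schmidt),
    the feature vector gamma is projected onto each basis vector, and
    every projection is quantised to a single bit by its sign."""
    rng = np.random.default_rng(token_seed)        # token-derived PRNG
    R = rng.uniform(-1.0, 1.0, size=(len(gamma), lb))
    Q, _ = np.linalg.qr(R)                         # orthonormal columns
    return (gamma @ Q > 0).astype(int)             # lb-bit FaceHash
```

Because the bits depend only on the signs of the projections, small offsets in Γ rarely flip them, which is the source of the error tolerance, while a different token seed yields a completely different basis and hence a different FaceHash.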

Cryptographic key interpolation (Shamir's (2, ls)-thresholding scheme)

The limited uncertainty of the FaceHash, b ∈ {0,1}^{l_b}, is addressed via Shamir secret-sharing (Shamir, 1979), which uses a modular polynomial p(x): Z_q → Z_q, with q a prime, for secret encoding p(0) = k_c, i.e. the cryptographic key k_c ∈ Z_q (with 2^{l_b} ≤ q) in our context. In the simplest case of linear polynomials, this allows secret recovery via

k_c = [H(b) p(H(r))] / [H(b) − H(r)] + [H(r) p(H(b))] / [H(r) − H(b)]  (mod q)

where r is the orthonormalised token number used to compute the FaceHash b, and H(·) is a hashing function. The details of the
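The linear-interpolation recovery above can be checked with a short sketch. The share points (x1, y1) and (x2, y2) stand in for (H(r), p(H(r))) and (H(b), p(H(b))), and the tiny prime is purely illustrative.

```python
def recover_key(x1, y1, x2, y2, q):
    """Recover the secret p(0) = k_c of a linear polynomial over Z_q
    from two shares, mirroring the interpolation formula above.
    pow(n, -1, q) computes the modular inverse of n modulo prime q."""
    t1 = y1 * x2 % q * pow(x2 - x1, -1, q) % q   # H(b)p(H(r)) / (H(b)-H(r))
    t2 = y2 * x1 % q * pow(x1 - x2, -1, q) % q   # H(r)p(H(b)) / (H(r)-H(b))
    return (t1 + t2) % q

# Example: p(x) = 7 + 3x over Z_101, so shares (2, 13) and (5, 22)
# interpolate back to the secret k_c = p(0) = 7.
print(recover_key(2, 13, 5, 22, 101))  # → 7
```

Any two valid shares suffice, which is exactly the (2, l_s)-threshold property: a fresh FaceHash-derived point plus the stored token-derived point reconstruct k_c without ever storing the key itself.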

Experiments and discussion

In this section, we provide experimental results to illustrate the high error tolerance of FaceHash, which is vital for key generation. The proposed method has been evaluated in terms of its same-user (genuine)/different-user (imposter) population distributions on the Essex Faces 94 and Olivetti (ORL) face databases. Essex Faces 94 contains frontal face photos taken from a fixed camera distance, under uniform background and illumination, with the subjects asked to

Security analysis

The security of the transformation H: {0,1}^{l_b} × ℝ^M → Z_q, where q is a prime number, should be evaluated in terms of the key-factors:

  • Independence, such as evaluation of H(r, Γ) in the absence of r or Γ.

Non-recovery of r or Γ given a specific value of H(r, Γ) and the other factor.

The benchmark is cryptographic hashing

h(r, k): {0,1}^l × {0,1}^{l′} → {0,1}^{2l}

of token r and secret knowledge k, where a = h(r, k) cannot be computed without both factors r and k, so that adversarial deduction is no more probable than random guessing, of order 1/2^{2l}.

Concluding remarks

This paper described an error-tolerant biometric feature discretisation methodology—FaceHashing—that leads to cryptographic key computation based on face biometrics with uniquely tokenised pseudo-random numbers. FaceHashing has significant functional advantages over solely biometric or token usage, such as an extremely clear separation of the genuine and imposter populations, thereby enabling an error-free decision. H(r, Γ) is furthermore highly secure with respect to

Andrew Teoh Beng Jin obtained his BEng (Electronic) in 1999 and PhD degree in 2003 from the National University of Malaysia. He is currently a lecturer in the Faculty of Information Science and Technology, Multimedia University. He held the post of co-chair (Biometrics Division) in the Centre of Excellence in Biometrics and Bioinformatics at the same university. His research interests are in multimodal biometrics, pattern recognition, multimedia signal processing and Internet security.

References (15)

  • C. Nastar et al.

    Flexible images: matching and recognition using learned deformations

    Comput Vision Image Understanding

    (1997)
  • M. Peyravian et al.

    Generating user-based cryptographic keys and random numbers

    Comput Secur

    (1999)
  • Albert Bodo. Method for producing a digital signature with aids of a biometric feature. German Patent DE 42 43 908 A1;...
  • Andrew TBJ, David NCL. Integrated wavelet and Fourier-Mellin invariant feature in fingerprint verification system. ACM...
  • Daugman John. Biometric decision landscapes, Technical Report, No. 482, Cambridge University Computer Laboratory;...
  • Davida GI, Frankel Y, Matt BJ. On enabling secure applications through offline biometric identification. Proceedings of...
  • A. Goh et al.

    Computation of cryptographic keys from face biometrics

    (2003)



David Chek Ling Ngo is an Associate Professor and the Dean of the Faculty of Information Science & Technology at Multimedia University, Malaysia. He has worked there since 1999. Ngo was awarded a BAI in Microelectronics & Electrical Engineering and PhD in Computer Science in 1990 and 1995 respectively, both from Trinity College Dublin. Ngo's research interests lie in the area of Automatic Screen Design, Aesthetic Systems, Biometric Encryption, and Knowledge Management. He is author and co-author of over 20 invited and refereed papers. He is a member of Review Committee of Displays and Multimedia Cyberscape.

Alwyn Goh is an experienced and well-published researcher in biometrics, cryptography and information security. His work is recognized by citations from the Malaysian National Science Foundation and the European Federation of Medical Informatics. He previously lectured Computer Sciences at Universiti Sains Malaysia where he specialised in data-defined problems, client server computing and cryptographic protocols. Goh has a Masters in Theoretical Physics from the University of Texas, and a Bachelors in Electrical Engineering and Physics from the University of Miami.

1 Tel.: +606 252 3111/3485.

2 Tel.: +603 58910155.
