Societal and Ethical Implications of Anti-Spoofing Technologies in Biometrics

  • Original Paper
  • Published in Science and Engineering Ethics

Abstract

Biometric identification is thought to be less vulnerable to fraud and forgery than traditional forms of identification. However, biometric identification is not without vulnerabilities. In a ‘spoofing attack’ an artificial replica of an individual’s biometric trait is used to induce a system to falsely infer that individual’s presence. Techniques such as liveness detection and multi-modality, as well as the development of new and emerging modalities, are intended to secure biometric identification systems against such threats. Unlike biometrics in general, the societal and ethical issues raised by spoofing and anti-spoofing techniques have not received much attention. This paper examines these issues.
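As a concrete, purely illustrative rendering of the two countermeasures named above, the following minimal sketch shows how a verification decision might gate on a liveness check and fuse scores from two modalities, so that spoofing a single trait is insufficient on its own. The paper proposes no algorithm; all function names, weights, and thresholds here are hypothetical.

```python
# Minimal illustrative sketch (not the authors' method): a decision rule
# combining liveness detection with multimodal score fusion.
# All weights and thresholds are hypothetical.

def accept(face_score: float, finger_score: float, liveness_score: float,
           liveness_threshold: float = 0.5, match_threshold: float = 0.7) -> bool:
    """Accept only if the sample appears live and the fused score is high."""
    if liveness_score < liveness_threshold:
        return False  # liveness detection rejects suspected artefacts
    fused = 0.5 * face_score + 0.5 * finger_score  # simple sum-rule fusion
    return fused >= match_threshold

# A convincing fingerprint replica alone does not clear the fused threshold:
print(accept(face_score=0.20, finger_score=0.95, liveness_score=0.9))  # False
print(accept(face_score=0.85, finger_score=0.90, liveness_score=0.9))  # True
```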


Notes

  1. Many excellent overviews of biometrics are available, including Jain et al. (2008, 2011).

  2. Examples of indirect attacks include the use of cyber-attacks, hacking, or re-engineering. Indirect attacks typically require some level of technical expertise.

  3. Not all direct attacks involve fake biometrics. In a “coercion attack” an individual is forced to present their trait against their will. In other attacks a genuine but detached trait—such as a severed finger—is used.

  4. Techniques for spoofing the ICAO modalities have been demonstrated many times. For examples see, on face: Pan et al. (2008), on fingerprint: Wehde and Beffel (1924), Van der Putte and Keuning (2000), Matsumoto et al. (2002), on iris: Wei et al. (2008); for an overview: Nixon et al. (2007).

  5. The European Commission-funded research project TABULA RASA (www.tabularasa-euproject.org) seeks to develop draft standards for investigation, to propose countermeasures, and to investigate the potential of biometric modalities that may be inherently less vulnerable to direct attacks.

  6. These factors are discussed by one of the present authors in a report on a series of interviews concerning societal and ethical aspects of anti-spoofing research for the TABULA RASA project. Research in this paper draws in part upon that report.

  7. There is a clear need for coordination and standardisation of the framework within which vulnerabilities and countermeasures are conceptualised and developed. The vulnerability database of the National Institute of Standards and Technology (NIST) in the USA offers an example of such standardisation; Europe currently lacks an equivalent. (It should be said that standardisation is a problem common to biometric systems and applications in general, not only anti-spoofing research.) The TABULA RASA project addresses the need for a draft set of standards in this area.

  8. Some of these drawbacks could be overcome by “agent-based” or “adaptive” biometric systems (Deravi et al. 2003); a toy sketch of the adaptive idea is given after these notes.

  9. For a recent treatment of emerging issues in “second generation” biometrics see Mordini and Tzovaras (2012).

  10. The use of biometrics is commonly associated with fundamental ethical objections focused on human dignity. While dignity plays an implicit role in this paper, we will not venture into existing controversies. We simply assume that the use of biometrics is, subject to appropriate safeguards and standards, in principle acceptable. One plausible line of justification appeals to the potentially major role of biometrics in solving problems of global identity management (Mordini and Rebera 2012).

  11. The Directive defines “personal data” as “information relating to an identified or identifiable natural person”, and its processing as “any operation or set of operations […] performed upon [it]” (EU 1995).

  12. In January 2012 the European Commission proposed a comprehensive reform of data protection rules in the EU. The Data Protection Directive will be replaced by a “General Data Protection Regulation” (EC 2012).

  13. The Article 29 Working Party advises the European Commission and EU Member States on data protection issues.

  14. For an analogous use of “context” see Nissenbaum (2009).

  15. If it is difficult to deny that biometric data could reveal information concerning health, it is perhaps even harder to deny that certain forms of biometric data could reveal ethnic or racial background. The Article 29 Working Party contends that “in biometric systems based on face recognition, data revealing racial or ethnic origin may be [meaning: is liable to be (rather than: is allowed to be)] processed” (A29WP 2003, 10).

  16. On “securitisation”, i.e. the extension of security’s domain beyond national security (e.g. military security), to domains such as the environment, the economy, society, etc. see Buzan et al. (1998); on the “risk society”, i.e. a society structured around risk, see Beck (1992), Giddens (2003).

  17. It is in the biometrics industry’s interest to disassociate its wares from ideas of risk and insecurity. Associating products intended for all with the identification of a disreputable minority (“threats”, “risks”) makes no business sense.

  18. This position is associated with Kant’s Categorical Imperative in the “formula of humanity” (Kant 2012 [1785]).

  19. Lying involves a certain sort of speech act. Spoofing (generally) does not.

  20. Given the controversy, Kant’s position should ideally be garnered from close reading of his many works. Given that we do not join the debate here, we direct the reader merely to the Groundwork (Kant 2012 [1785]), Lectures on Ethics (Kant 2001 [1775–1780]), and “On a Supposed Right to Lie from Altruistic Motives” (Kant 1976 [1797]). A flavour of the interpretative debate might be gleaned from, inter alia, Korsgaard (1986).

  21. Those sympathetic to Kantian ethics tend to take Kant as having misinterpreted his own commitments regarding lying; those unsympathetic to Kant take his position on lying as indicative of shortcomings in his approach to moral philosophy more generally (Korsgaard 1986, 2–3).

  22. A positive case for political lying (in certain circumstances) is made by Newey (1997).

  23. In the three cases discussed, the word “legitimate” is used in a broad sense, according to which a legitimate authority is an authority whose position is both legally and morally justifiable. Thus (as in the third case) an authority may be considered illegitimate on moral grounds, even if it attained its position through legal (e.g. democratic) channels.

  24. Hence promoting biometrics through incentivising schemes (e.g. ensuring shorter queues for biometric gates at airports) is acceptable only if users are appropriately well-informed.

  25. On system security we may at least say this: professional spoofers looking for vulnerabilities will probably discover them whether or not they are openly revealed. But in any case there are design options (e.g. Kerckhoffs’ Principle, illustrated in a sketch after these notes) which can minimise the dangers.

  26. According to the data protection principle of fairness, data subjects should be aware of the collection and use of their biometric data. They should also be adequately informed about key elements of the processing, such as the identity of a “data controller”, the purpose and duration of the processing, the type of data, their rights to access, modify or cancel their data, their right to withdraw consent to its collection or processing, and information about the recipients to whom the data are disclosed. See Art. 6 and Art. 10 of Directive 95/46/EC (EU 1995). (A schematic rendering of these notice elements is given after these notes.)
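Note 8 refers to “adaptive” biometric systems. As a toy illustration of the adaptive idea (our own sketch, not the design of Deravi et al. 2003, with invented parameters), an enrolled template can be blended with confidently matched samples so that the system tracks gradual change in a trait:

```python
# Toy illustration of template adaptation (not Deravi et al.'s design):
# high-confidence samples are blended into the enrolled template, so the
# system follows gradual changes in the user's trait.
import numpy as np

def update_template(template: np.ndarray, sample: np.ndarray,
                    match_score: float, confidence_threshold: float = 0.9,
                    learning_rate: float = 0.1) -> np.ndarray:
    """Blend a confidently matched sample into the template; else keep it."""
    if match_score < confidence_threshold:
        return template  # never adapt on uncertain matches
    return (1.0 - learning_rate) * template + learning_rate * sample
```

One design caveat: adaptation interacts with anti-spoofing, since repeated undetected spoofs could gradually poison an adaptive template; this is one reason to gate adaptation on high-confidence, liveness-checked matches.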
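Note 25 invokes Kerckhoffs’ Principle: a system should remain secure even if everything about its design is openly revealed, provided a key stays secret. A hypothetical sketch (not a scheme from the literature discussed here) applying the principle to template storage:

```python
# Illustration of Kerckhoffs' Principle: this design can be openly revealed;
# only SECRET_KEY must remain secret. A hypothetical example, not a deployed
# biometric protection scheme.
import hashlib
import hmac
import os

SECRET_KEY = os.urandom(32)  # the sole secret in the design

def tag_template(template_bytes: bytes) -> bytes:
    """Compute a keyed integrity tag for a stored biometric template."""
    return hmac.new(SECRET_KEY, template_bytes, hashlib.sha256).digest()

def verify_template(template_bytes: bytes, tag: bytes) -> bool:
    """Forging a valid tag requires the key, not secrecy about the design."""
    return hmac.compare_digest(tag_template(template_bytes), tag)
```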
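Note 26 enumerates the elements about which data subjects must be informed. Purely as a schematic aid (the Directive prescribes the information, not a data format, and all field names below are ours), those elements can be pictured as a single record:

```python
# Schematic record of the notice elements summarised in note 26.
# A hypothetical schema; Directive 95/46/EC mandates the content only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingNotice:
    data_controller: str       # identity of the data controller
    purpose: str               # purpose of the processing
    duration_days: int         # duration of the processing
    data_types: List[str]      # type of data, e.g. ["fingerprint template"]
    recipients: List[str]      # recipients to whom the data are disclosed
    subject_rights: List[str] = field(default_factory=lambda: [
        "access", "modify", "cancel", "withdraw consent"])
```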

References

  • A29WP. (2003). Working document on biometrics. Article 29 data protection working party, 12168/02/EN WP 80. At http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2003/wp80_en.pdf. Accessed September 5, 2012.

  • A29WP. (2012). Opinion 3/2012 on developments in biometric technologies. Article 29 data protection working party, 00720/12/EN, WP193. At http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2012/wp193_en.pdf. Accessed September 5, 2012.

  • Beck, U. (1992). Risk society: Towards a new modernity (M. Ritter, Trans.). London: Sage.

  • Buzan, B., Wæver, O., & de Wilde, J. (1998). Security: A new framework for analysis. London: Lynne Rienner.

  • Deravi, F., Fairhurst, M., Guest, R., Mavity, N., & Canuto, A. (2003). Intelligent agents for the management of complexity in multimodal biometrics. International Journal of Universal Access in the Information Society, 2(4), 239–304.

  • EC. (2012). Proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 final. At http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2012:0011:FIN:EN:PDF. Accessed September 5, 2012.

  • EU. (1995). Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. Official Journal of the European Communities, L281, 31–50.

  • EU. (2000). Charter of fundamental rights of the European Union. Official Journal of the European Communities, 2000/C 364, 1–22.

  • Garfinkel, S. (2001). Database nation: The death of privacy in the 21st century. Cambridge, MA: O’Reilly.

  • Giddens, A. (2003). Runaway world: How globalization is shaping our lives. London: Routledge.

  • Jain, A. K., Flynn, P., & Ross, A. A. (Eds.). (2008). Handbook of biometrics. New York: Springer.

  • Jain, A. K., Ross, A. A., & Nandakumar, K. (Eds.). (2011). Introduction to biometrics. New York: Springer.

  • Kant, I. (1976 [1797]). On a supposed right to lie from altruistic motives. In his Critique of practical reason and other writings in moral philosophy. New York: Garland.

  • Kant, I. (2001 [1775–1780]). In P. Heath & J. B. Schneewind (Eds.), Lectures on ethics. Cambridge: Cambridge University Press.

  • Kant, I. (2012 [1785]). In M. Gregor & J. Timmermann (Eds.), Groundwork of the metaphysics of morals. Revised edition. Cambridge: Cambridge University Press.

  • Korsgaard, C. M. (1986). The right to lie: Kant on dealing with evil. Philosophy & Public Affairs, 15(4), 325–349.

  • Lummis, R. C., & Rosenberg, A. E. (1972). Test of an automatic speaker verification method with intensively trained mimics (A). Journal of the Acoustical Society of America, 51(1A), 131–132.

  • Lyon, D. (2001). Surveillance society: Monitoring everyday life. Buckingham: Open University Press.

  • Mahon, J. E. (2003). Kant on lies, candour and reticence. Kantian Review, 7, 102–133.

  • Matsumoto, T., Matsumoto, H., Yamada, K., & Hoshino, S. (2002). Impact of artificial gummy fingers on fingerprint systems. In Proceedings of SPIE 4677, optical security and counterfeit deterrence techniques IV (pp. 275–289).

  • Mordini, E., & Rebera, A. P. (2012). No identification without representation: constraints on the use of biometric identification systems. Review of Policy Research, 29(1), 5–20.

  • Mordini, E., & Tzovaras, D. (Eds.). (2012). Second generation biometrics: The ethical, legal and social context. Dordrecht: Springer.

  • Newey, G. (1997). Political lying: A defense. Public Affairs Quarterly, 11(2), 93–116.

  • Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford, CA: Stanford University Press.

  • Nixon, K. A., Aimale, V., & Rowe, R. K. (2007). Spoof detection schemes. In A. K. Jain, P. Flynn, & A. A. Ross (Eds.), Handbook of biometrics (pp. 403–423). New York: Springer.

  • Pan, G., Wu, Z., & Sun, L. (2008). Liveness detection for face recognition. In K. Delac, M. Grgic, & M. S. Bartlett (Eds.), Recent advances in face recognition (pp. 109–124). Vienna: IN-TECH.

  • Van der Putte, T., & Keuning, J. (2000). Biometrical fingerprint recognition: Don’t get your fingers burned. In Proceedings of the conference on smart card research and advanced applications (CARDIS 2000) (pp. 289–303).

  • Wehde, A., & Beffel, J. N. (1924). Finger-prints can be forged. Chicago: Tremonia Publishing Co.

  • Wei, Z., Qiu, X., Sun, Z., & Tan, T. (2008). Counterfeit iris detection based on texture analysis. In 19th international conference on pattern recognition (pp. 1–4). IEEE.

Acknowledgments

This work has been partly funded by two grants from the European Commission: TABULA RASA—Trusted Biometrics under Spoofing Attacks (Grant Agreement no. 257289); and SAPIENT—Supporting Fundamental Rights, Privacy and Ethics in Surveillance Technologies (Grant Agreement no. 261698).

Author information

Corresponding author

Correspondence to Andrew P. Rebera.

About this article

Cite this article

Rebera, A.P., Bonfanti, M.E. & Venier, S. Societal and Ethical Implications of Anti-Spoofing Technologies in Biometrics. Sci Eng Ethics 20, 155–169 (2014). https://doi.org/10.1007/s11948-013-9440-9

