
Smile, you are being identified! Risks and measures for the use of facial recognition in (semi-)public spaces

  • Original Research
  • Published in AI and Ethics

Abstract

This article analyses the use of facial recognition technology (FRT) in (semi-)public spaces with a focus on the Brazilian context. To this end, it addresses the operation of the FRT processing chain and the juridical nature of the facial signature, concentrating mainly on the Brazilian data protection framework. FRT is used in everyday life for several purposes, such as security, digital ranking, targeted marketing and health protection. However, its indiscriminate use poses high risks to privacy and data protection. From this perspective, to avoid harms such as inaccuracy, the normalisation of cyber-surveillance and a lack of transparency, safeguards are identified to guarantee individual rights, including soft law, oversight, international standards and regulatory sandboxes.

Fig. 1 (Source: EFF, adapted)

Fig. 2


Notes

  1. In this article, we use the term “(semi-)public” to refer to two types of spaces, one of a public nature and the other of a private nature. Where it is necessary to distinguish between them, we use the term sensu stricto to identify the former.

  2. For an English version of the LGPD, refer to https://iapp.org/media/pdf/resource_center/Brazilian_General_Data_Protection_Law.pdf.

  3. LGPD, Art 5, I.

  4. Article 4. This Law does not apply to the processing of personal data:

    III—Carried out for the sole purpose of:

    (a) public security;

    (b) national defence;

    (c) State security;

    § 1 The processing of personal data provided for in Item III shall be governed by specific legislation, which shall provide for proportionate and strictly necessary measures to meet the public interest, observing due process of law, the general principles of protection and the rights of the data subject provided for in this Law.

  5. Law 7.102/73, Art. 10. Activities carried out by way of the provision of services are considered private security when intended to:

    (I) carry out the property surveillance of financial institutions and other public or private establishments, as well as the security of individuals;

    (II) carry out the transport of valuables or ensure the transport of any other type of cargo.

  6. LGPD, Art. 5, VI—Controller: natural or legal person, whether governed by public or private law, who is responsible for the decisions relating to the processing of personal data (free translation).

  7. Article 11. Sensitive personal data may only be processed in the following cases: […]

    (II) without the data subject’s consent, in cases where it is indispensable for:

    (a) compliance with legal or regulatory obligations by the controller;

    (b) sharing of data necessary for the execution by the public administration of public policies provided for by laws or regulations;

    (c) carrying out of studies by a research body, ensuring, whenever possible, the anonymisation of sensitive personal data;

    (d) regular exercise of rights, including in contracts and in judicial, administrative and arbitral proceedings, the latter pursuant to Law No 9.307 of 23 September 1996 (Arbitration Act);

    (e) protection of the life or physical safety of the data subject or of a third party;

    (f) health protection, which is carried out exclusively by healthcare professionals, health services or health authorities; or

    (g) ensuring the prevention of fraud and the security of the data subject in processes of identification and authentication of registration in electronic systems, preserving the rights referred to in Article 9 of this Law, except where fundamental rights and freedoms of the data subject requiring the protection of personal data prevail. (free translation).

  8. Further details of this Directive are mentioned in section IV.C.

  9. A match occurs when the artificial intelligence system finds a correspondence between an image captured by the camera and another image contained in a given database (a minimal sketch of this comparison step follows these notes).

  10. These safeguards were based on the surveillance-related case-law of the European Court of Justice and the European Court of Human Rights. See Moraes, T.G.: A Spark of Light in the Going Dark: Legal Safeguards for Law Enforcement’s Encryption Circumvention Measures. Master’s thesis, Law and Technology LLM Program, Tilburg University (2019).
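
The following sketch is meant only to make note 9 concrete. It is a minimal illustration, assuming (as is common in FRT pipelines, though not detailed in this article) that faces have already been detected and reduced to numeric templates (embeddings); a “match” is then a similarity comparison between the probe template captured by the camera and each reference template in the data bank, accepted only above a threshold. The function names, the 128-dimensional random vectors and the 0.6 threshold below are hypothetical choices, not those of the systems discussed in the article.

    # Illustrative sketch of the "match" step described in note 9 (hypothetical,
    # not the article's implementation): compare a probe embedding against a
    # gallery of reference embeddings and accept the best candidate only if its
    # similarity clears a threshold.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two face embeddings, in [-1, 1]."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def best_match(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
        """Return (identity, score) if the best gallery entry clears the
        threshold, otherwise (None, score)."""
        best_id, best_score = None, -1.0
        for identity, reference in gallery.items():
            score = cosine_similarity(probe, reference)
            if score > best_score:
                best_id, best_score = identity, score
        return (best_id, best_score) if best_score >= threshold else (None, best_score)

    # Toy usage: random 128-dimensional vectors stand in for real face templates.
    rng = np.random.default_rng(seed=42)
    gallery = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
    probe = gallery["person_A"] + rng.normal(scale=0.1, size=128)  # noisy capture of person_A
    print(best_match(probe, gallery))  # expected: ("person_A", score close to 1.0)

In practice the choice of threshold trades false positives against false negatives, which is where the accuracy and misidentification concerns discussed in the article (e.g. reference 27) arise.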

References

  1. Peterson, M.: Living with difference in hyper-diverse areas: how important are encounters in semi-public spaces? https://www.tandfonline.com/doi/full/10.1080/14649365.2016.1210667 (2016). Accessed 20 Jan 2020

  2. Sabbagh, D.: Facial recognition row: police gave King’s Cross owner images of seven people. The Guardian. https://www.theguardian.com/technology/2019/oct/04/facial-recognition-row-police-gave-kings-cross-owner-images-seven-people (2019). Accessed 20 Jan 2020

  3. Lisboa, V.: Câmeras de reconhecimento facial levam a 4 prisões no carnaval do Rio. Agência Brasil. http://agenciabrasil.ebc.com.br/geral/noticia/2019-03/cameras-de-reconhecimento-facial-levam-4-prisoes-no-carnaval-do-rio (2019). Accessed 20 Jan 2020

  4. Lobato, L., et al.: Videomonitoramento: mais câmeras, mais segurança? https://igarape.org.br/videomonitoramento-webreport/ (2020). Accessed 22 Jun 2020

  5. Metrô: Metrô compra sistema de monitoramento eletrônico com reconhecimento facial. http://www.metro.sp.gov.br/noticias/28-06-2019-metro-compra-sistema-de-monitoramento-eletronico-com-reconhecimento-facial.fss (2019). Accessed 20 Jan 2020

  6. Chivers, T.: Facial recognition… coming to a supermarket near you. The Guardian. https://www.theguardian.com/technology/2019/aug/04/facial-recognition-supermarket-facewatch-ai-artificial-intelligence-civil-liberties (2019). Accessed 20 Jan 2020

  7. Mann, M., Smith, M.: Automated facial recognition technology: recent developments and approaches to oversight. UNSW Law J. 40(1), 121 (2017)


  8. Güven, K.: Facial recognition technology: lawfulness of processing under the GDPR in employment, digital signage and retail context. Master’s thesis, Tilburg University (2019)

  9. Garvais, J.: How facial recognition works? https://us.norton.com/internetsecurity-iot-how-facial-recognition-software-works.html (2018). Accessed 20 Jan 2020

  10. Article 29 Data Protection Working Party—Art29WP: Opinion 02/2012 on facial recognition in online and mobile services. https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2012/wp192_en.pdf (2012a). Accessed 22 Sept 2020

  11. Article 29 Data Protection Working Party—Art29WP: Opinion 03/2012 on developments in biometric technologies. https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2012/wp193_en.pdf (2012b). Accessed 22 Sept 2020

  12. EDPB: Guidelines 3/2019 on processing of personal data through video devices. https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_201903_video_devices_en_0.pdf (2020). Accessed 20 Jan 2020

  13. Brazil: Constitution of the Federative Republic of Brazil of 1988. http://www.planalto.gov.br/ccivil_03/Constituicao/ (1988). Accessed 20 Jan 2020

  14. Instituto Brasileiro de Defesa do Consumidor—IDEC: Carta Idec nº 30/2019/Coex. https://idec.org.br/sites/default/files/carta_idec_coex.pdf (2019). Accessed 20 Feb 2020

  15. Paiva, A., et al.: Novas ferramentas, velhas práticas: reconhecimento facial e policiamento no Brasil. http://observatorioseguranca.com.br/wp-content/uploads/2019/11/1relatoriorede.pdf (2019). Accessed 20 Jan 2020

  16. Ma, A.: China has started ranking citizens with a creepy “social credit” system—here’s what you can do wrong, and the embarrassing, demeaning ways they can punish you. Business Insider. https://www.businessinsider.com/china-social-credit-system-punishments-and-rewards-explained-2018-4 (2018). Accessed 20 Jan 2020

  17. Lakshmanan, R.: India is going ahead with its facial recognition program despite privacy concerns. The Next Web. https://thenextweb.com/security/2019/11/11/india-goes-ahead-with-its-facial-recognition-program-despite-privacy-concerns/ (2019). Accessed 20 Jan 2020

  18. Brazil, Law 7.102, of 20 June 1973: http://www.planalto.gov.br/ccivil_03/LEIS/L7102.htm (2019). Accessed 22 Jun 2020

  19. Wiewiórowski, W.: Facial recognition: a solution in search of a problem? https://edps.europa.eu/node/5551 (2019). Accessed 20 Jan 2020

  20. Instituto Brasileiro de Defesa do Consumidor: Após denúncia do Idec, Hering é condenada por uso de reconhecimento facial. https://idec.org.br/noticia/apos-denuncia-do-idec-hering-e-condenada-por-uso-de-reconhecimento-facial (2019). Accessed 4 Sep 2020

  21. Dekkers, D.: Privacy or security? ‘Function creep’ kills your privacy. Digidentity. https://www.digidentity.eu/en/article/Function-creep-kills-your-privacy/ (2016). Accessed 20 Jan 2020

  22. Kuo, L.: The new normal: China’s excessive coronavirus public monitoring could be here to stay. The Guardian. https://www.theguardian.com/world/2020/mar/09/the-new-normal-chinas-excessive-coronavirus-public-monitoring-could-be-here-to-stay (2020). Accessed 20 Jan 2020

  23. Tétrault-Farber: Moscow deploys facial recognition technology for coronavirus quarantine. Thomson Reuters. https://uk.reuters.com/article/us-china-health-moscow-technology/moscow-deploys-facial-recognition-technology-for-coronavirus-quarantine-idUKKBN20F1RZ (2020)

  24. OECD: Tracking and tracing COVID: Protecting privacy and data while using apps and biometrics. https://www.oecd.org/coronavirus/policy-responses/tracking-and-tracing-covid-protecting-privacy-and-data-while-using-apps-and-biometrics/ (2020). Accessed 20 Jun 2020

  25. European Commission: Commission Recommendation (EU) 2020/518 of 8 April 2020 on a common Union toolbox for the use of technology and data to combat and exit from the COVID-19 crisis, in particular concerning mobile applications and the use of anonymised mobility data. https://eur-lex.europa.eu/eli/reco/2020/518/oj (2020). Accessed 20 Jun 2020

  26. Wright, E.: The future of facial recognition is not fully known: developing privacy and security regulatory mechanisms for facial recognition in the retail sector. Fordham Intell. Property Media Entertainment Law J. 29(2), 611–686 (2019)


  27. G1: Facial recognition system used by Rio de Janeiro’s Military Police fails and woman is mistakenly detained. https://g1.globo.com/rj/rio-de-janeiro/noticia/2019/07/11/sistema-de-reconhecimento-facial-da-pm-do-rj-falha-e-mulher-e-detida-por-engano.ghtml (2019). Accessed 03 Sep 2020

  28. European Union Agency of Fundamental Rights—FRA: Facial recognition technology: fundamental rights considerations in the context of law enforcement. https://fra.europa.eu/en/publication/2019/facial-recognition (2019). Accessed 20 Jan 2020

  29. Galic, M., Timan, T., Koops, B.: Bentham, deleuze and beyond: an overview of surveillance theories from the panopticon to participation. Philos. Technol. 30, 9–37 (2017)


  30. Article 29 Data Protection Working Party—Art29WP: Working document on biometrics https://iapp.org/media/pdf/resource_center/wp80_biometrics_08-2003.pdf (2003). Accessed 20 Jan 2020

  31. Howe, N., Strauss, W.: Millennials rising: the next great generation. Knopf Doubleday Publishing Group, New York (2009)


  32. Auxier, B., et al.: Americans and privacy: concerned, confused and feeling lack of control over their personal information. https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/ (2019). Accessed 20 Jan 2020

  33. Fulton, J., Kibby, M.: Millennials and the normalization of surveillance on Facebook. J. Media Cult. Stud. 31(2), 189–199 (2017)


  34. Pinho, M.: Governo anuncia R$ 10 mi em bolsas de estudos para combate ao crime. https://noticias.r7.com/brasil/governo-anuncia-r-10-mi-em-bolsas-de-estudos-para-combate-ao-crime-08012020 (2020). Accessed 20 Jan 2020

  35. Sloan, R., Warner, R.: Algorithms and human freedom. Santa Clara High Technol. Law J. 35, 4 (2019)


  36. Solove, D.: Data mining and the security–liberty debate. Univ. Chicago Law Rev. 75(1), 343–361 (2008)


  37. Ferguson, A.G.: Big data and predictive reasonable suspicion. Univ. Pennsylvania Law Rev. 163(2), 327 (2015)


  38. National Institute of Standards and Technology—NIST: Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects. https://doi.org/10.6028/NIST.IR.8280 (2019). Accessed 20 Jan 2020

  39. Hagemann, R., Huddleston, J., Thierer, A.: Soft law for hard problems: the governance of emerging technologies in an uncertain future. Colorado Technol. Law J. 17(1), 37 (2018)


  40. Moraes, T.G.: A spark of light in the going dark: legal safeguards for law enforcement’s encryption circumvention measures. Master’s thesis, Tilburg University, p. 50 (2019)


  41. Aranha, M. I.: Manual de direito regulatório: fundamentos de direito regulatório, 3rd edn. Laccademia Publishing (2015)

  42. da Mota Alves, F., Vieira,G.A.S.: Sem a ANPD, a LGPD é um problema, não uma solução. https://www.jota.info/paywall?redirect_to=//www.jota.info/opiniao-e-analise/artigos/anpd-lgpd-problema-solucao-06012020 (2020). Accessed 20 Jun 2020

  43. Danish Standards Foundation: A World Built on Standards: A Textbook for Higher Education. https://www.ds.dk/media/px5jhney/a-world-built-on-standards.pdf (2015). Accessed 20 Jan 2020

  44. Chamber of Deputies: The committee will have 120 days to draw up the preliminary draft, which will then be examined by Congress. https://www.camara.leg.br/noticias/618483-maia-cria-comissao-de-juristas-para-propor-lei-sobre-uso-de-dados-pessoais-em-investigacoes/ (2019). Accessed 9 Jan 2020

  45. Lee, D.: San Francisco is the first US city to ban facial recognition. BBC. https://www.bbc.com/news/technology-48276660 (2019). Accessed 20 Jan 2020

  46. Zetzsche, D., et al.: Regulating a revolution: from regulatory sandboxes to smart regulation. Fordham J. Corporate Financial Law 2018, 30 (2018)


  47. Information Commissioner’s Office –ICO: The Guide to the Sandbox (beta phase). https://ico.org.uk/for-organisations/the-guide-to-the-sandbox-beta-phase/ (2019). Accessed 20 Jan 2020

  48. International Federation of Accountants—IFAC: Regulatory divergence: costs, risks, impacts. https://www.ifac.org/system/files/publications/files/IFAC-OECD-Regulatory-Divergence.pdf (2018). Accessed 20 Jan 2020

  49. Babuta, A., Oswald, M., Rinik, C.: Machine learning algorithms and police decision-making legal, ethical and regulatory challenges. https://rusi.org/sites/default/files/201809_whr_3-18_machine_learning_algorithms.pdf.pdf (2018). Accessed 20 Jan 2020

  50. Banco Central do Brasil–BCB: Detalhamento da Consulta n. 72. https://www3.bcb.gov.br/audpub/DetalharAudienciaPage?5&pk=321 (2019). Accessed 20 Jan 2020

  51. Comissão de Valores Mobiliários—CVM: Audiência Pública da CVM para criação de ambiente regulatório experimental (sandbox regulatório). http://www.cvm.gov.br/noticias/arquivos/2019/20190828-1.html (2019). Accessed 20 Jan 2020

  52. Tsai, C.: To regulate or not to regulate? A comparison of government responses to peer-to-peer lending among the United States, China, and Taiwan. University of Cincinnati Law Rev (2019)


Funding

Not applicable.

Author information


Corresponding author

Correspondence to Eduarda Costa Almeida.

Ethics declarations

Conflict of interest

The authors declare that they have no conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Moraes, T.G., Almeida, E.C. & de Pereira, J.R.L. Smile, you are being identified! Risks and measures for the use of facial recognition in (semi-)public spaces. AI Ethics 1, 159–172 (2021). https://doi.org/10.1007/s43681-020-00014-3


  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s43681-020-00014-3

Keywords