Facial recognition technologies are powerful tools that have come to dominate many applications, often serving as 'gatekeepers' for access to services, social spaces, and security protocols. They are embedded in many societal surveillance programs (both private corporate and governmental) without general public knowledge. Only recently have legal and ethical guidelines begun to address how these technologies can be used responsibly, how their training datasets are collected and vetted for quality, and the ways in which their biases may harm particular communities or the larger public. Responsible computer scientists and product development teams should be aware of their responsibilities to ensure fairness, accountability, and transparency when creating applications that use facial recognition technologies.

Goal:

The goal of this module is to gain an overview of facial recognition technology and to consider multiple perspectives on the social and ethical issues surrounding its use.

Related Themes and Technologies:

In this module we will read about different perspectives on facial recognition through text narratives and participate in an activity simulating the implementation of facial recognition on a college campus.

Pre-Module

Please complete the following before we meet:

  1. Answer this short concept check, which takes approximately 10 minutes. Then, read and watch the following narratives:

    • 5 min
    • CNN
    • 2010
    Why face recognition isn’t scary — yet

    Algorithms and machines can struggle with facial recognition, and need ideal source images to perform it consistently. However, its potential use in monitoring and identifying citizens is concerning.

    • 3 min
    • CNBC
    • 2013
    How Facial Recognition Technology Could Help Catch Criminals

    Facial recognition software, which applies computer vision and biometric technology to an image of a person to identify them, has potential applications in law enforcement for catching suspects or criminals. However, identification is probabilistic, especially as captured photos or videos become blurrier and require an additional layer of software analysis to be “de-pixelized.” Identification also depends on the databases to which the FBI has access.

    • 8 min
    • Kinolab
    • 2016
    Lacie Part I: Translating Online Interactions and Social Quantification

    In a world in which social media is constantly visible, and in which each person's averaged five-star rating, based on every one of their interactions with others, is displayed, Lacie tries to move into the higher echelons of society. She does this by consistently keeping up saccharine appearances in real life and on her social media feed, because everyone is constantly connected to this technology. Spurred to raise her rating, Lacie gets an invitation to a high-profile wedding. However, after a few unfortunate events leave her seeming less desirable to others, thus lowering her rating, she finds her world far less accessible and kind. For further reading and real-life connections, see the narrative “Inside China’s Vast New Experiment in Social Ranking.”

Activity 1

In five groups, please read the following narratives:

  1. Group 1:

    • 2 min
    • azfamily.com
    • 2018
    Facial recognition technology now used in Phoenix area to locate lost dogs

    Facial recognition technology has found a new application: reuniting dogs with their owners. A simple machine learning algorithm takes a photo of a dog and searches a database of photos of shelter dogs in hopes of finding a match.

    • 5 min
    • Silicon Angle
    • 2019
    Empathic AI mirrors human emotions to help autistic children

    Artificial companions assist developmentally disabled children based on the principle that humans can indeed form emotional connections with nonhuman objects. In fact, it is not exceedingly difficult for robots to read or mirror human emotions, which could have positive implications in workplace or educational settings.

    • 5 min
    • BBC
    • 2021
    Facial recognition technology meant mum saw dying son

    The ability of facial recognition technology used by the South Wales Police force to identify an individual based on biometric data nearly instantly rather than the previous standard of 10 days allowed a mother to say goodbye to her son on his deathbed. It seems to have other positive impacts, such as identifying criminals earlier than they otherwise might have been. However, as is usually the case, concerns abound about how this facial recognition technology can violate human rights.

  2. Group 2:

    • 3 min
    • techviral
    • 2018
    New Facial Recognition System Helps Trace 3000 Missing Children In Just 4 Days

    In India, where the disappearance of children is a common social issue, facial recognition technology has been useful in identifying and locating many missing or displaced children. This breakthrough suggests the technology can be applied to help ameliorate this issue, as well as in other areas such as law enforcement.

    • 5 min
    • The New York Times
    • 2019
    How Biometrics Makes You Safer

    In New York City, biometrics were used as one step in the investigation process, combined with human oversight, to help identify criminals and victims alike.

    • 5 min
    • New York Times
    • 2020
    A Case for Facial Recognition

    Decisions on whether or not law enforcement should be trusted with facial recognition are tricky, as is argued by Detroit city official James Tate. On one hand, the combination of the bias latent in the technology itself and the human bias of those who use it sometimes leads to over-policing of certain communities. On the other hand, with the correct guardrails, it can be an effective tool in getting justice in cases of violent crime. This article details the ongoing debate about how much facial recognition technology use is proper in Detroit.

  3. Group 3:

    • 7 min
    • Slate
    • 2019
    Facebook’s Face-ID Database Could Be the Biggest in the World. Yes, It Should Worry Us.

    A discussion of Facebook’s massive collection of human faces and its potential impact on society.

    • 40 min
    • New York Times Magazine
    • 2021
    Your Face Is Not Your Own

    This article goes into extraordinary detail on Clearview AI, a company whose algorithm has crawled the public web to collect over 3 billion photos of faces, each linked to its original source. It discusses the legality and privacy concerns of this technology, its existing use by law enforcement and in court cases, and the founding of the company. Private use of technology similar to Clearview AI’s could revolutionize society and may move us into a post-privacy era.

    • 40 min
    • New York Times
    • 2021
    She’s Taking Jeff Bezos to Task

    As facial recognition technology becomes more prominent in everyday life, used by law enforcement officials and private actors alike to identify faces by comparing them with databases, AI ethicists and experts such as Joy Buolamwini push back against the many forms of bias these technologies exhibit, specifically racial and gender bias. Governments often use such technologies callously or irresponsibly, and the lack of regulation of the private companies that sell these products could lead society into a post-privacy era.

  4. Group 4:

    • 5 min
    • CNET
    • 2019
    Demonstrators scan public faces in DC to show lack of facial recognition laws

    Fight for the Future, a digital activist group, used Amazon’s Rekognition facial recognition software to scan faces on the street in Washington, DC, to argue that there should be more guardrails on this type of technology before it is deployed for ends that violate human rights, such as identifying peaceful protesters.

    • 7 min
    • Amnesty International
    • 2021
    AMNESTY INTERNATIONAL CALLS FOR BAN ON THE USE OF FACIAL RECOGNITION TECHNOLOGY FOR MASS SURVEILLANCE

    Amnesty International released a statement detailing its opposition to the widespread use of facial recognition technology for mass surveillance, based on its misuse, its unfair impact on Black communities, and the chilling effect it would have on peaceful protest.

    • 7 min
    • Slate
    • 2021
    Maine Now Has the Toughest Facial Recognition Restrictions in the U.S.

    A new law passed unanimously in Maine heavily restricts the contexts in which facial recognition technology can be deployed, putting significant guardrails around its use by law enforcement. It also allows citizens to sue if they believe the technology has been misused. This is a unique step at a time when several levels of government, up to the federal government, have been reluctant to attach strict rules to the use of facial recognition technology, despite the clear bias seen in the wake of its use.

  5. Group 5:

    • 5 min
    • The Guardian
    • 2019
    New York tenants fight as landlords embrace Biometrics cameras

    Biometric technology will be implemented as a means of gaining access to a residential building in Brooklyn, causing pushback among tenants who prefer to keep their data private, especially given the lack of legal regulation surrounding the technology. Specifically, there is growing fear that the facial recognition database could be sold to or abused by law enforcement.

    • 10 min
    • New York Times
    • 2019
    As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias

    This article examines racial bias in facial recognition software used for government civil surveillance in Detroit. The racially biased technology diminishes the agency of minority groups and amplifies latent human bias.

    • 5 min
    • Gizmodo
    • 2021
    CBP Facial Recognition Scanners Failed to Find a Single Imposter At Airports in 2020

    Customs and Border Protection used facial recognition technology to scan travelers entering the U.S. at several points of entry in 2020 and did not identify any impostors. This is part of a larger program of using biometrics to screen those who enter the country, which raises concerns about data privacy, who may have access to this data, and how it may be used.
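Across these narratives, the underlying mechanism is the same: a probe face is converted to a numeric feature vector (an "embedding") and compared against a database of enrolled vectors. A minimal sketch of that matching step, using hypothetical low-dimensional embeddings and a made-up similarity threshold (real embeddings have hundreds of dimensions, and real thresholds are carefully tuned):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.9):
    """Return (identity, score) for the best match, or (None, score)
    if no score clears the threshold.

    `database` maps identity -> embedding. The threshold trades false
    matches against false non-matches; choosing it is itself an
    ethically loaded design decision.
    """
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical enrolled embeddings.
db = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.8, 0.5]}
print(identify([0.88, 0.15, 0.22], db))  # probe close to alice: matches
print(identify([0.5, 0.5, 0.5], db))     # ambiguous probe falls below threshold
```

Note that the system always returns its *best guess* along with a score; the probabilistic nature mentioned in several narratives (blurry images, incomplete databases) lives entirely in how that score and threshold are interpreted.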

Activity 2

In two groups, please participate in the taskforce simulation:

  1. You will now participate in a taskforce simulation. As a group, list all of the benefits or negative consequences you can think of if a college were to implement a facial recognition system to manage its campus life programs. Think of these potential benefits or harms from the perspectives of students, faculty, administrators, alumni, and parents.

    Things to talk about in your group:

    • What kinds of values and ethical issues were reflected in the narratives that you read in your small group?

    • How did the narratives you read earlier change or shape your feelings about implementing facial recognition in a college/university setting?

    • What recommendations should we provide, given our role as computer science students? What resources are available to us as part of the expert review process?

    (Hint: see the Post-Activity section for examples of such resources.)

Post-Module

Please complete the following after we meet:

  1. Choose one of the following narratives to read. Ask yourself while you are reading:

    • How do these guidelines reflect current values and concerns we have read/talked about so far?

    • Do they talk about whose responsibility it is to make sure apps using facial recognition technologies are created and used in an ethical manner?

    • What is the motivation (explicit/implicit) for this narrative and who is its target audience?

    Narratives:

  2. Please complete this short concept check. Yes, it contains the same questions as the one you completed earlier.

    • What is our role and our guiding responsibilities for developing ethical technologies as computer science students?

    Here are some resources that should guide any product development lifecycle (design, development, deployment, and evaluation).

For Instructors - Module Goal and Learning Objectives

The goal of this module is to provide students with an overview of facial recognition technology and the ethical issues surrounding its use. The module is designed for introductory to intermediate CS courses (i.e., Intro to CS through Algorithms). It follows the CEN format, asking students to consider their preexisting ideas and knowledge about the technology (Pre-Activity); consider the history of who created the technology, its original/intended purpose, and the impact of its widespread applications (Activity 1); reflect on society-level impact from multiple perspectives (technology creator, campus community member, person in society) (Activity 2); and assess existing ethical guidelines from technology organizations (Post-Activity).

  • Students will be able to identify a technology, who designed it, and the purpose it was designed for.

  • Students will be able to identify who the technology was not intended for and who it potentially may harm.

  • Students will be able to articulate how the technology has changed over time and the different purposes it now serves.

  • Students will be able to articulate and discuss possible benefits (such as increased security and ease of identification) and issues (such as harvesting of personal information) with an increase in facial recognition technology.

We have provided examples of assessments you may want to copy and use on your own LMS for secure student data collection. The rationale, format, and time length for the module components are listed below.

Pre-Activity is an online pre-module assignment asking students to respond to a series of questions about what they know about this technology and to view provided links to narratives.

Activity 1 is a class activity (0.5 hours) that places students into small groups to read and discuss a set of provided narratives to inform them about certain perspectives on facial recognition technology.

Activity 2 is a class activity (1.5 hours) that provides a brief history of facial recognition technology and then divides the class into two groups (n≈24, or more for a larger or online class) to evaluate the positive and negative impacts of facial recognition technology as applied to their college campus community and services. Groups are given specific roles and perspectives to portray as part of the role-play activity. Homework provides additional narratives to consider after the class activity.

Post-Activity is an online post-module assignment that asks students to respond again to a series of questions about what they now know about this technology, based on the narratives they viewed or read and the Activity 1 and 2 discussions.

Additional Resources

  • 3 min
  • Vimeo: Shalini Kantayya
  • 2020
Coded Bias: How Ignorance Enters Computer Vision

A brief visual example of an application of computer vision for facial recognition, how these algorithms can be trained to recognize faces, and the dangers that come with biased datasets, such as an overrepresentation of white men.
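The dataset bias described above shows up concretely as unequal error rates across demographic groups, which is what audits such as Joy Buolamwini's Gender Shades project measured. A minimal sketch of that kind of per-group audit, using entirely hypothetical evaluation records:

```python
from collections import defaultdict

# Hypothetical evaluation records: (group, predicted_match, actual_match).
# In a real audit these would come from running the model on a labeled,
# demographically annotated test set.
records = [
    ("group_a", True, True), ("group_a", False, False),
    ("group_a", True, True), ("group_a", True, True),
    ("group_b", True, False), ("group_b", False, True),
    ("group_b", True, True), ("group_b", True, False),
]

def error_rates_by_group(records):
    """Fraction of misclassified examples per demographic group."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

rates = error_rates_by_group(records)
print(rates)  # group_a: 0.0, group_b: 0.75 on this toy data
```

A model can look accurate on aggregate statistics while failing badly for an underrepresented group, which is why disaggregated reporting like this is a common recommendation in the ethical guidelines students assess in the Post-Activity.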