Civil Surveillance (40)

Video and data surveillance by public and private entities.

Narratives can be browsed by ethical theme or by technology.

Themes
  • Privacy
  • Accountability
  • Transparency and Explainability
  • Human Control of Technology
  • Professional Responsibility
  • Promotion of Human Values
  • Fairness and Non-discrimination
Technologies
  • AI
  • Big Data
  • Bioinformatics
  • Blockchain
  • Immersive Technology

As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias
  • New York Times · 2019 · 10 min

This article examines racial bias in facial recognition software used for government civil surveillance in Detroit. The racially biased technology diminishes the agency of minority groups and amplifies latent human bias.

New Facial Recognition System Helps Trace 3000 Missing Children In Just 4 Days
  • techviral · 2018 · 3 min

In India, where the disappearance of children is a widespread social problem, facial recognition technology has been used to identify and locate many missing or displaced children. The breakthrough suggests the technology can help ameliorate this issue and be applied in other areas such as law enforcement.
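
The article does not describe the matching pipeline in detail. As a rough sketch of how such systems typically work, the snippet below compares face embeddings with cosine similarity and flags likely matches above a threshold; the random vectors stand in for embeddings that a real face-recognition model would produce from photographs, and the names and threshold are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_missing_children(missing_embeddings, found_embeddings, threshold=0.6):
    """Compare each embedding of a child found in a shelter against the
    database of missing-child embeddings; report pairs above the threshold."""
    matches = []
    for found_id, f_vec in found_embeddings.items():
        for missing_id, m_vec in missing_embeddings.items():
            score = cosine_similarity(m_vec, f_vec)
            if score >= threshold:
                matches.append((missing_id, found_id, score))
    return sorted(matches, key=lambda t: -t[2])

# Toy usage: random vectors stand in for real face embeddings
rng = np.random.default_rng(0)
missing = {f"missing_{i}": rng.standard_normal(128) for i in range(5)}
found = {"found_0": missing["missing_2"] + 0.05 * rng.standard_normal(128)}
print(match_missing_children(missing, found))
```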

You Need to Opt Out of Amazon Sidewalk
  • Gizmodo · 2020 · 5 min

This article describes Amazon's new Sidewalk feature and explains why users should not buy into the service. Sidewalk uses the internet of things created by Amazon devices such as the Echo and the Ring camera to build a secondary network connecting nearby homes that contain these devices, sustained by each home “donating” a small amount of broadband. The article argues that this is a dangerous concept because the shared network may be susceptible to hackers, putting a large number of users at risk.
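
As a conceptual sketch only, not Amazon's actual protocol, the snippet below illustrates the risk the article raises: once nearby homes' devices bridge into one shared network, a single compromised node can reach every home in its connected component. The home names, coordinates, and radio range are invented for illustration.

```python
from collections import deque
from itertools import combinations

# Hypothetical neighborhood: home -> (x, y) position of its bridging device
homes = {"A": (0, 0), "B": (1, 0), "C": (2, 1), "D": (5, 5), "E": (2, 2)}
RADIO_RANGE = 1.8  # illustrative bridging range, not a real Sidewalk parameter

def in_range(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 <= RADIO_RANGE

# Build the secondary neighborhood network: an edge wherever two devices can bridge
links = {h: set() for h in homes}
for a, b in combinations(homes, 2):
    if in_range(homes[a], homes[b]):
        links[a].add(b)
        links[b].add(a)

def reachable_from(compromised):
    """Homes exposed if one device on the shared network is compromised."""
    seen, queue = {compromised}, deque([compromised])
    while queue:
        for nxt in links[queue.popleft()] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return seen

print(reachable_from("A"))  # every home bridged to A, directly or indirectly
```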

How Cops Are Using Algorithms to Predict Crimes
  • Wired · 2018 · 12 min

This video offers a basic introduction to the use of machine learning in predictive policing and to how it disproportionately affects low-income communities and communities of color.
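
The disproportionate impact the video describes is often explained as a feedback loop: models trained on historically skewed arrest records send more patrols back to the same neighborhoods, which produces more recorded incidents there. The toy simulation below illustrates that dynamic with invented numbers; it is not the algorithm shown in the video.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two hypothetical districts with the SAME underlying crime rate, but
# district 0 starts with more recorded incidents because it was
# patrolled more heavily in the past.
true_crime_rate = np.array([0.10, 0.10])
recorded = np.array([60.0, 30.0])  # biased historical record
patrols_per_step = 10

for step in range(20):
    # "Predictive" allocation: patrols proportional to recorded incidents
    share = recorded / recorded.sum()
    patrols = share * patrols_per_step
    # More patrols -> more incidents observed and recorded, even though the
    # underlying rates are identical, so the initial bias reinforces itself.
    recorded += rng.poisson(patrols * true_crime_rate * 10)

print("final recorded incidents:", recorded)
print("final patrol share:", recorded / recorded.sum())
```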

Why 2020 was a pivotal, contradictory year for facial recognition
  • MIT Tech Review · 2020 · 7 min

This article examines several case studies from 2020 to discuss both the widespread use of facial recognition technology and the growing potential to limit it. The author argues that training and identification drawing on social media platforms, combined with use by law enforcement, is dangerous for minority groups and protesters alike.

What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias
  • The Verge · 2020 · 7 min

PULSE is an algorithm that can supposedly determine what a face looks like from a pixelated image. The problem: more often than not, the algorithm returns a white face, even when the person in the pixelated photograph is a person of color. Rather than actually sharpening the image, the algorithm creates a synthetic face whose downscaled version matches the pixel pattern. It is these synthetic faces that demonstrate a clear bias toward white people, showing how thoroughly institutional racism makes its way into technological design. Thus, diversity in data sets will not fully help until broader solutions for combating bias are enacted.
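
The mechanism described above, searching for a synthetic face whose downscaled version matches the pixelated input rather than enhancing the input itself, can be sketched as a latent-space optimization. The snippet below uses a toy linear “generator” in place of the pretrained face model the real system relies on; any bias baked into that generator's training data is carried straight into the reconstructed face.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT, HI, LO = 32, 16, 4                         # latent size, hi-res side, low-res side
W = rng.standard_normal((HI * HI, LATENT)) * 0.3   # toy "generator" weights

def generate(z):
    """Toy stand-in for a face generator: latent vector -> 16x16 'image'."""
    return (1.0 / (1.0 + np.exp(-(W @ z)))).reshape(HI, HI)

def downscale(img):
    """Average-pool the hi-res image down to the pixelated resolution."""
    k = HI // LO
    return img.reshape(LO, k, LO, k).mean(axis=(1, 3))

# The pixelated input we are trying to "explain"
target_lo = downscale(generate(rng.standard_normal(LATENT)))

def loss(z):
    return np.sum((downscale(generate(z)) - target_lo) ** 2)

# PULSE-style search: adjust z until the generated face, once downscaled,
# matches the pixelated target (finite-difference gradient for simplicity).
z = rng.standard_normal(LATENT)
eps, lr = 1e-4, 0.2
for _ in range(300):
    grad = np.array([(loss(z + eps * e) - loss(z - eps * e)) / (2 * eps)
                     for e in np.eye(LATENT)])
    z -= lr * grad

print("final reconstruction error:", loss(z))
```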
