News Articles
Find narratives by ethical themes or by technologies.
How Facial Recognition Technology Could Help Catch Criminals
- 3 min
- CNBC
- 2013
Facial recognition software uses computer vision and biometric technology to identify a person from an image, and it has potential applications in law enforcement for catching suspects or criminals. However, identification is probabilistic, especially when the photos or videos captured are blurry and need an additional layer of software analysis to be “de-pixelized.” Identification also depends on the databases to which the FBI has access.
How should law enforcement balance training these facial recognition programs on sufficient, high-quality data against the privacy risks of accessing more databases of citizens’ faces? Where can human bias enter the human-computer systems described in the article? Should there be any margin of error or element of probability in technologies used in high-stakes areas like law enforcement?
-
Startups at CES showed how tech can help elderly people and their caregivers
- 3 min
- TechCrunch
- 2021
This article presents several case studies of technologies introduced at CES that are specifically designed to help elderly people continue to live independently, mostly using smartphones and Internet of Things devices to monitor both the home environment and the physical health of the occupant.
What implications do these technologies have for the agency of the senior citizens whom they are meant to monitor? Does close surveillance truly equate to increased independence? Are there other downsides or tradeoffs to these technologies?
-
Facebook hit with antitrust lawsuit from FTC and 48 state attorneys general
- 5 min
- ABC News
- 2020
The United States government is seeking to break up the tech monopoly that is Facebook, hoping to restore competition in the social networking and data-selling markets the company dominates. Facebook, unsurprisingly, is resisting these efforts.
What role did data collection and use play in Facebook’s rise as a monopoly power? What would breaking up this monopoly accomplish? Will users achieve more data privacy if one large company does not own several platforms on which users communicate?
-
Who Gets a Say in Our Dystopian Tech Future?
- 7 min
- The New Republic
- 2020
The story of Dr. Timnit Gebru’s termination from Google is inextricably bound up with Google’s irresponsible practices around training data for its machine learning algorithms. The article argues that training natural language processing algorithms on massive data sets is ultimately harmful: for all the environmental costs and biases against certain languages it introduces, machines still cannot fully comprehend human language.
Should machines be trusted to handle and process the incredibly nuanced meaning of human language? How do different understandings of what languages and words mean and represent become harmful when a minority of people are deciding how to train NLP algorithms? How do tech monopolies prevent more diverse voices from entering this conversation?
-
Amazon, TikTok, Facebook, Others Ordered To Explain What They Do With User Data
- 5 min
- NPR
- 2020
After the FTC and 48 states charged Facebook with being a monopoly in late 2020, the FTC continued its push for tech-monopoly accountability by demanding that large social media companies, including Facebook, TikTok, and Twitter, disclose exactly what they do with user data, in hopes of increasing transparency. Pair with “Facebook hit with antitrust lawsuit from FTC and 48 state attorneys general.”
Do you think that users, especially younger users, would trade their highly tailored recommender systems and social network experiences for data privacy? How much does transparency from tech monopolies help when many people are not fluent in how algorithms work? Should social media companies release the abstractions of users that they form from this data?
-
Social media allowed a shocked nation to watch a coup attempt in real time
- 4 min
- TechCrunch
- 2021
On the day of the January 6th insurrection at the U.S. Capitol, social media proved to be a valuable tool for telling the story of the horrors taking place within the Capitol building. At the same time, social media plays a large role in political polarization, as users can end up on fringe sites where content is tailored to their beliefs and not always true.
How can social media platforms be redesigned or regulated to crack down more harshly on misinformation and extremism? How much can social media be valued as a set of platforms that “help tell the true story of an event” when they also allow mass denial of objective fact? Who should be responsible for shutting down fringe sites, and how should this happen?