Big Data (34)
Find narratives by ethical themes or by technologies.
We read the paper that forced Timnit Gebru out of Google. Here’s what it says.
- 10 min
- MIT Technology Review
- 2020
This article explains Timnit Gebru’s ethical warnings against training natural language processing algorithms on large language models built from vast sets of internet text. Not only does this process have a negative environmental impact, it also still does not allow these machine learning tools to process semantic nuance, especially as it relates to burgeoning social movements or countries with lower internet access. Dr. Gebru’s refusal to retract the paper ultimately led to her dismissal from Google.
How should models for training NLP algorithms be more closely scrutinized? What sorts of voices are needed at the design table to ensure that the impact of such algorithms is consistent across all populations? Can this ever be achieved?
Tim Cook May Have Just Ended Facebook
- 5 min
- Inc
- 2021
On International Data Privacy Day, Apple CEO Tim Cook fired shots at Mark Zuckerberg and Facebook’s model of mining user data through platform analytics and web mining to serve targeted ads to users. By contrast, Cook painted Apple as a privacy-oriented company that wants to make technology work for its users, not to collect their data and manipulate them psychologically through advertising.
Are you convinced that Apple has a better business model than Facebook? Should users be responsible for taking steps to protect themselves against web mining, or should Facebook be responsible for adding more guardrails? What are the consequences of both Facebook and Apple products being involved in larger architectures that extend beyond the singular digital artifact?
Microsoft patented a chatbot that would let you talk to dead people. It was too disturbing for production
- 3 min
- CNN
- 2021
The wealth of social data on any given person afforded by digital artifacts, such as social media posts and text messages, can be used to train an algorithm newly patented by Microsoft to create a chatbot meant to imitate that specific person. This technology has not been released, however, due to the harrowing ethical implications of impersonation and dissonance. For the Black Mirror episode referenced in the article, see the narratives “Martha and Ash Parts I and II.”
How do humans control their identity when it can be replicated through machine learning? What sorts of quirks and mannerisms are unique to humans and cannot be replicated by an algorithm?
This Site Published Every Face From Parler’s Capitol Riot Videos
- 7 min
- Wired
- 2021
An anonymous college student created a website titled “Faces of the Riot,” a virtual wall containing over 6,000 face images of insurrectionists present at the riot at the Capitol on January 6th, 2021. The site, which used facial detection algorithms to crawl through videos posted to the right-wing social media site Parler, aims to have viewers report to the proper authorities any perpetrators they recognize. While the creator put privacy safeguards in place, such as using “facial detection” rather than “facial recognition,” and their intentions are supposedly positive, some argue that the implications for privacy and the widespread adoption of this technique could be negative.
Who deserves to be protected from having shameful data about themselves posted publicly to the internet? Should there even be any limits on this? What would happen if a similar website appeared in a less seemingly noble context, such as identifying members of a minority group in a certain area? How could sites like this expand the agency of bad or discriminatory actors?
Amazon, TikTok, Facebook, Others Ordered To Explain What They Do With User Data
- 5 min
- NPR
- 2020
After the FTC and 48 states charged Facebook with being a monopoly in late 2020, the agency continued its push to hold tech monopolies accountable by demanding that large social network companies, including Facebook, TikTok, and Twitter, explain exactly what they do with user data, in hopes of increased transparency. Pair with “Facebook hit with antitrust lawsuit from FTC and 48 state attorneys general.”
Do you think that users, especially younger users, would trade their highly tailored recommender systems and social network experiences for data privacy? How much does transparency from tech monopolies help when many people are not fluent in how algorithms work? Should social media companies release the abstractions of users that they form using data?
Hello, World! It is ‘I’, the Internet
- 7 min
- Wired
- 2020
In discussing the history of the singular Internet that many global users experience every day, this article reveals some dangers of digital technologies becoming transparent through repeated use and reliance. Namely, it becomes more difficult to imagine a world where there could be alternatives to the current digital way of doing things.
Is it too late to imagine alternatives to the Internet? How could people be convinced to get on board with a radical redo of the internet as we know it? Do alternatives need to be imagined before forming a certain digital product or service, especially if they end up being as revolutionary as the internet? Are the most popular and powerful digital technologies and services “tools”, or have they reached the status of cultural norms and conduits?