Networking, Capital, and Cloud Computing (60)
As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias
- 10 min
- New York Times
- 2019
Racially biased facial recognition software used for government civil surveillance in Detroit diminishes the agency of minority groups and amplifies latent human bias.
What are the consequences of employing biased technologies to surveil citizens? Who loses agency, and who gains it?
-
Technology Versus Religious Fanaticism
- 8 min
- Kinolab
- 1997
Dr. Ellie Arroway is a scientist chosen to make contact with the first confirmed extraterrestrial life. As contact with the star system Vega advances, religious fanatics and other extremist groups prepare for the moment of interaction. This moment in the film juxtaposes Ellie, an atheist scientist who looks forward only to the scientific progress of extraterrestrial contact, with others, including religious extremists, who fear it. More broadly, the clip explores how technology can have diverse social impacts and foster hysteria as it shatters preconceived notions. Later, one such religious terrorist sabotages Dr. Drumlin's transport in the new machine through a suicide bombing, killing them both.
How might technological advancement challenge preconceived notions of the world, especially religious ones? Are science, computer science, and innovation religions in their own right? In an increasingly networked world, how do extremist enclaves rally together to pose a threat to humanity? How can ignorance be combated in an age where information can be accessed quickly and technology reshapes the landscape of society?
-
Trusting Machines and Variable Outcomes
- 9 min
- Kinolab
- 2002
In the year 2054, the PreCrime police program is about to go national. At PreCrime, three clairvoyant humans known as “PreCogs” forecast future murders by streaming audiovisual data that reveals the surrounding details of each crime, including the names of the victims and perpetrators. Although there are no cameras, the implication is that anyone can be under constant surveillance by this program. Once the “algorithm” has gleaned enough data about the future crime, officers move out to stop the murder before it happens. In this narrative, the PreCrime program is audited, and the officers must explain the ethics and philosophies behind their systems. After Captain John Anderton is accused of a future crime, he flees and learns of “minority reports”: instances of disagreement between the PreCogs covered up by the department to make the justice system seem infallible.
What are the problems with treating the results of computer algorithms as infallible or entirely objective? How are such systems prone to bias, especially when two different algorithms might make two different predictions? Is there any way that algorithms could make the justice system more fair? How might humans manipulate the results of a predictive crime algorithm to serve themselves? Should a technology such as a crime-prediction algorithm be made more transparent to its users and the general public, so that people do not trust it with a religious sort of fervor?
-
Community and Belonging
- 9 min
- Kinolab
- 2016
Eleanor Shellstrop, a selfish woman, ends up in the utopian afterlife The Good Place by mistake after her death. She spins an elaborate web of lies to ensure that she is not sent to be tortured in The Bad Place. In this narrative, the demons of the Bad Place try to wrest Eleanor’s soul away from the Good Place by convincing her that the Bad Place is where she truly belongs. This resonates with Eleanor, who was always a lone wolf and never found a community of people she liked. Ultimately, though, she fights to stay in the Good Place out of fondness for the community she has come to know there.
Can our desire to be better outweigh our past actions? How do digital technologies help people find communities where they feel they belong? Does the intention to improve as a person matter as much as actually improving?
-
Consent and Control with Personal Data
- 6 min
- Kinolab
- 2019
In an imagined future London, citizens across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. Kate Hatfield, a new mother, discovers that someone has hacked the device in her head and accessed some of her lived memories. The culprit is later revealed to be her father-in-law Lawrence, who was attempting to implant the Feed into Bea, the new baby.
What dangers come with “backing up” memory to some type of cloud account? What risks are posed by the hackers and corporations that run such backup services? Is there something special about the transient, temporary nature of human memory that should remain as it is? How much of our privacy are we willing to sacrifice for safety and connectivity? How should consent work for installing a brain-computer interface in a person? Should a parent or other family member be able to decide this for a child?
-
Implanted Technology and Disconnection
- 2 min
- Kinolab
- 2019
In an imagined future London, citizens across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. In this narrative, Max, a citizen whose Feed was hacked, must have the device removed from his body as his best friends watch. The procedure removes some of his memories from both his brain and the device, although they manage to upload these memories to a cloud.
What risks are involved with brain-computer interfaces, especially when we need to remove them from our brains? How might this increase medical costs? How can memory and consciousness be “backed up” and “uploaded” back into our bodies using advanced technology?