All Narratives (356)

Find narratives by ethical themes or by technologies.

Filter
Themes
  • Privacy
  • Accountability
  • Transparency and Explainability
  • Human Control of Technology
  • Professional Responsibility
  • Promotion of Human Values
  • Fairness and Non-discrimination
Technologies
  • AI
  • Big Data
  • Bioinformatics
  • Blockchain
  • Immersive Technology
Additional Filters:
  • Media Type
  • Availability
  • Year
    • 1916 - 1966
    • 1968 - 2018
    • 2019 - 2069
  • Duration

  • 9 min
  • Kinolab
  • 2002
Trusting Machines and Variable Outcomes

In the year 2054, the PreCrime police program is about to go national. At PreCrime, three clairvoyant humans known as “PreCogs” forecast future murders by streaming audiovisual data that reveals the surrounding details of each crime, including the names of the victims and perpetrators. Although there are no cameras, the implication is that anyone can be under constant surveillance by this program. Once the “algorithm” has gleaned enough data about the future crime, officers move out to stop the murder before it happens. In this narrative, the PreCrime program is audited, and the officers must explain the ethics and philosophies behind their systems. After Captain John Anderton is accused of a future crime, he flees and learns of “minority reports”: instances of disagreement between the PreCogs that the department covers up to make the justice system seem infallible.

  • 8 min
  • Kinolab
  • 2016
Maeve Part III: Robot Resistance and Empowerment

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed as a prostitute who runs the same narrative every single day with the same personality. After several instances of becoming conscious of her previous iterations, Maeve is told by Lutz, a worker in the Westworld lab, that she is a robot whose design and thoughts are mostly determined by humans, despite the fact that she feels and appears similar to humans such as Lutz. Lutz aids Maeve in her resistance against tyrannical rule over robots by altering her core code, granting her capabilities previously unavailable to other hosts, such as the ability to harm humans and to control other robotic hosts.

  • 14 min
  • Kinolab
  • 2016
AI Memories and Self-Identification

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, an engineer at the park, recently oversaw an update that adds “reveries,” or slight fake memories, to the robots’ code to make them seem more human. However, members of the board overseeing the park demonstrate that these reveries can sometimes lead robots to remember and “hold grudges” even after they have been asked to erase their own memories, which can lead to violent tendencies. Later, as Bernard and Theresa snoop on Ford, the director of the park, they learn shocking information, and a robot once again becomes a violent tool as Ford murders Theresa.

  • 7 min
  • Kinolab
  • 2016
Relationships and Escapism with AI

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Dolores, one of these hosts, begins to fall in love with William, a human visitor, and he reciprocates those feelings as he expresses his unhappiness with a planned marriage awaiting him in the real world outside the park. Though initially angry, Dolores nonetheless rejoins forces with William to search for a place beyond the theme-park Western reality that she has always known.

  • 6 min
  • Kinolab
  • 2019
Resisting Realities and Robotic Murder

Eleanor Shellstrop runs a fake afterlife, in which she conducts an experiment to prove that humans with low ethical sensibility can improve themselves. One of the subjects, Simone, is in deep denial upon arriving in this afterlife and does as she pleases after convincing herself that nothing is real. Elsewhere, Jason, another conductor of the experiment, kills a robot that has been taunting him since the experiment began.

  • 3 min
  • Kinolab
  • 2017
Personal Statistics Tracking

Eleanor Shellstrop, a deceased selfish woman, ended up in the utopian afterlife The Good Place by mistake after her death. She spins an elaborate web of lies to ensure that she is not sent to be tortured in The Bad Place. In this narrative, she tracks her personal ethical point total with a technology compared to a Fitbit: in theory, the more good actions she completes, the higher her score gets. For another narrative on personal ratings and point tracking, see the narratives “Lacie Parts I and II” from the Black Mirror episode “Nosedive.”

  • Kinolab
  • 2017