Human Control of Technology (67)

  • 12 min
  • Kinolab
  • 1965
Supercomputer Rule and Condensing Human Behavior

The city of Alphaville is under the complete rule of Alpha-60, an omnipresent robot whose knowledge is vaster than that of any human. This robot, whose learning and knowledge model is deemed “too complex for human understanding,” cements its rule by effectively outlawing emotion in Alphaville, with every definition of consciousness centering on rationality. All words expressing curiosity or emotion are erased from human access, with asking “why” replaced by saying “because.” Lemmy is a detective who has entered Alphaville from an outside land to destroy Alpha-60. In their conversation, however, Alpha-60 immediately susses out the suspicious aspects of Lemmy’s visit and character.

  • 9 min
  • Kinolab
  • 1995
Self-Sustaining Programs

In this world, a human consciousness (“ghost”) can inhabit an artificial body (“shell”), producing edited humans in largely robotic bodies. The Puppet Master, a notorious villain in this world, is revealed to be not a human hacker but a computer program that has gained sentience and gone on to hack a captured shell. It challenges the law enforcement officials of Section 6 and Section 9, insisting that it is a life-form and not an AI: its existence as a self-sustaining program that has achieved singularity, it argues, is no different from human DNA, itself a “self-sustaining program.” The Puppet Master specifically points to reproduction and offspring, rather than copying, as the feature distinguishing living things from nonliving ones. It has also developed an emotional connection with Major, which leads it to select her as a candidate for merging; it notes that although it can die, it will live on through the merger and, after Major’s death, on the internet.

  • 12 min
  • Kinolab
  • 1973
Simulated Humans and Virtual Realities

Simulacron is a virtual reality populated by 10,000 simulated humans who believe themselves to be sentient but are in fact nothing more than programs. These identity units neither know nor understand that they are artificial beings, and they behave as though they were real humans. “Real” humans can enter the virtual reality through a brain-computer interface and control the identity units. Christopher Nobody, a suspect whom Fred is trying to track down, suffered a mental breakdown after the revelation that he was an identity unit. While following the case, Fred meets Einstein, a virtual unit who desires to join the real world. As Einstein enacts the final stages of this plan, Fred discovers a shocking secret about his own identity. For a similar concept, see the narrative “Online Dating Algorithms” on the Hang the DJ episode of Black Mirror.

  • 7 min
  • ZDNet
  • 2020
Rebooting AI: Deep learning, meet knowledge graphs

Dr. Gary Marcus argues that deep machine learning as it currently exists does not maximize AI’s potential to collect and process knowledge. Machine “brains,” he contends, should have more innate knowledge than they do, much as animal brains come pre-equipped to process an environment. Ideally, that baseline knowledge would be used to collect and process information from knowledge graphs: semantic webs of information available on the internet that can be hard for an AI to consume without translation into machine vocabularies such as RDF.
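The knowledge-graph data model the article refers to can be sketched in a few lines: facts are stored as subject–predicate–object triples, the same shape RDF uses. This is a minimal illustration with invented facts and made-up `ex:` identifiers, not a real graph or the rdflib API.

```python
# A knowledge graph as a set of (subject, predicate, object) triples,
# mirroring RDF's data model. All URIs and facts below are illustrative.
triples = {
    ("ex:AlanTuring", "rdf:type", "ex:Person"),
    ("ex:AlanTuring", "ex:bornIn", "ex:London"),
    ("ex:London", "ex:locatedIn", "ex:England"),
}

def objects(graph, subject, predicate):
    """Return every object linked to `subject` via `predicate`."""
    return {o for (s, p, o) in graph if s == subject and p == predicate}

print(objects(triples, "ex:AlanTuring", "ex:bornIn"))  # {'ex:London'}
```

A system with "innate" background knowledge, in Marcus's sense, would come with many such triples (and rules over them) built in, rather than having to induce everything from raw training data.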

  • 4 min
  • VentureBeat
  • 2020
Researchers Find that Even Fair Hiring Algorithms Can Be Biased

A study of the engine behind TaskRabbit, an app whose algorithm recommends the best workers for a specific task, demonstrates that even algorithms that attempt to account for fairness and parity in representation can, depending on context, fail to provide what they promise.
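"Parity in representation" can be made concrete with a small check: compare how often each group is recommended. The data and group labels below are invented for illustration; this is not TaskRabbit's engine or the study's actual metric.

```python
# Hypothetical recommendation log: (worker_group, was_recommended).
recommended = [
    ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False),
]

def selection_rate(records, group):
    """Fraction of a group's workers that the algorithm recommended."""
    hits = [rec for g, rec in records if g == group]
    return sum(hits) / len(hits)

# Demographic parity holds when the rates are (roughly) equal across groups.
rate_a = selection_rate(recommended, "A")  # 2/3
rate_b = selection_rate(recommended, "B")  # 1/3
print(rate_a, rate_b, abs(rate_a - rate_b))
```

The study's point is that an algorithm can satisfy a check like this on one candidate pool yet still skew its recommendations once the context (task type, query, pool composition) changes.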

  • 7 min
  • MIT Technology Review
  • 2020
Tiny four-bit computers are now all you need to train AI

This article details a new approach emerging in AI research: instead of using 16 bits to represent each piece of data used to train an algorithm, a logarithmic scale can reduce that number to four, which is more efficient in both time and energy. This may allow machine learning algorithms to be trained on smartphones, enhancing user privacy. Beyond that, however, it may not change much in the AI landscape, especially in terms of helping machine learning reach new horizons.
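The core idea — log-spaced levels cover a wide dynamic range with very few bits — can be sketched as follows. The magnitude range, the split into one sign bit plus three magnitude bits, and the rounding rule are all illustrative assumptions, not the actual format from the research the article covers.

```python
import math

LO, HI = 1e-3, 1.0   # representable magnitude range (illustrative values)
LEVELS = 8           # 3 magnitude bits -> 8 levels; the 4th bit stores the sign

def quantize(x):
    """Snap x to the nearest of 8 log-spaced magnitude levels, keeping its sign."""
    if x == 0:
        return 0.0
    sign = 1.0 if x > 0 else -1.0
    mag = min(max(abs(x), LO), HI)            # clamp into the representable range
    step = math.log(HI / LO) / (LEVELS - 1)   # even spacing in the log domain
    code = round(math.log(mag / LO) / step)   # the 3-bit magnitude code
    return sign * LO * math.exp(code * step)

print(quantize(0.5), quantize(-0.002))
```

Linear 4-bit spacing over the same range would waste most of its levels near the top; spacing them logarithmically keeps the *relative* error roughly constant, which is why tiny gradients and large activations can share the same 4-bit code.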
