Human consciousness leaving its bodily form in order to persist beyond the human lifespan.
Digital Immortality (29)
Find narratives by ethical themes or by technologies.
- 15 min
- 2024
The false promise of keeping a loved one “alive” with A.I. grief bots.
In this piece, Leong—a Catholic attorney and theology graduate student—explores the ethical, spiritual, and emotional implications of “grief tech,” particularly AI-powered “ghostbots” that simulate conversations with deceased loved ones. She critiques this technology through a Christian theological lens, drawing on thinkers like Karl Rahner and Tina Beattie to argue that such digital recreations undermine the embodied nature of human personhood and the Christian understanding of death.
- What does the article suggest about the meaning of personhood, and how might AI griefbots distort this concept?
- How might someone from a different religious or cultural tradition respond differently to the idea of digitally resurrecting the dead?
-
- 10 min
- Daily Mail
- 2024
Think twice before using AI to digitally resurrect a dead loved one: So-called ‘griefbots’ could HAUNT you, Cambridge scientists warn
Citing a study from Cambridge University, this article discusses ways in which grief bots may be exploitative. Because a grief bot builds its connection with the user on the identity and reputation of a loved one, it can sway the user’s decisions. Although the article accepts that a grief bot may be therapeutic in some cases, it warns that users could be nudged into purchases by the bot. A grief bot can also blur its role: if a terminally ill woman leaves a grief bot for her child, for example, the bot might suggest an impending in-person meeting with the child. In a third scenario, a dying parent secretly subscribes to a grief bot service before his death, and maintaining the bot becomes intense emotional labour for his children.
- What are possible rituals that could be used to retire grief bots?
- Should a grief bot be making any recommendations to the user? What are the potential problems or harms that could be caused by these recommendations?
-
- 50 min
- Science and Engineering Ethics
- 2022
The Ethics of ‘Deathbots’
Lindemann identifies grief bots, or “deathbots,” as techno-social niches that shape the affective state of the user. Focusing on the dignity of the bereaved rather than the deceased, Lindemann argues that grief bots can both regulate and deregulate users’ emotions, and characterizes the relationship with a grief bot as a pseudo-bond. The article is chiefly concerned with the grief and well-being of griefbot users.
- What does Lindemann mean by internet-enabled techno-social niches, and what are some examples of them?
- After reading this paper, would you ever use a deathbot, or allow your digital remains to be used to create one? Why or why not?
- Outline the key data-protection and safety requirements you would test in a pilot program before approving any clinical deployment of grief bots.
-
- 9 min
- Kinolab
- 2013
Martha and Ash Part II: Digital Revival and Human Likeness in Hardware
At some point in the near future, Martha’s husband Ash dies in a car accident. To help Martha through the grieving process, her friend Sara gives Ash’s data to a company that creates an artificial intelligence program to simulate text and phone conversations between Martha and Ash. Eventually, this program is uploaded onto a robot with the exact likeness of the deceased Ash. Creeped out by the humanoid robot and its imprecision in capturing Ash’s personality, Martha wants nothing more than to keep it out of her sight.
- How can memories be kept pure when robots are able to impersonate deceased loved ones?
- If programs and robots such as this can be created, do we truly own our own existence?
- How can artificial intelligence fail as therapy or companionship?
- Can artificial intelligence and robotics help comfort people who never even met the deceased?
- How should an artificial companion be handled by its administrator?
- Can an animated or robotic humanoid likeness of a person who seemingly has feelings be relegated to the attic as easily as other mementos can?
-
- 3 min
- CNN
- 2021
Microsoft patented a chatbot that would let you talk to dead people. It was too disturbing for production
Digital artifacts such as social media posts and text messages provide abundant social data on any given person, which can be used to train an algorithm patented by Microsoft to create a chatbot meant to imitate that specific person. The technology has not been released, however, due to the harrowing ethical implications of impersonation and dissonance. For the Black Mirror episode referenced in the article, see the narratives “Martha and Ash Parts I and II.”
- How do humans control their identity when it can be replicated through machine learning?
- What sorts of quirks and mannerisms are unique to humans and cannot be replicated by an algorithm?
-
- 5 min
- Wired
- 2020
The Ethics of Rebooting the Dead
As technologies for digitally preserving deceased loved ones become more and more plausible, it is critical to consider the implications of tools that aim to capture and replicate the personality and traits of those who have passed. Not only might this change the natural process of grieving and healing, but it may also have alarming consequences for the agency of the dead. For the corresponding Black Mirror episode discussed in the article, see the narratives “Martha and Ash Parts I and II.”
- Should anyone be allowed to use digital resurrection technologies if they feel it may better help them cope?
- With all the data points that exist for internet users in this day and age, is it easier to create versions of deceased people which are uncannily similar to their real identities? What would be missing from this abstraction?
- How is a person’s identity kept uniform or recognizable if they are digitally resurrected?