Dr. Stephenie Lucas Oney is 75 years old, but she still turns to her father for advice. How did he deal with racism? she wonders. How did he succeed when the odds were against him?
The answers are based on William Lucas's experience as a Black man from Harlem who made a living as a police officer, FBI agent and judge. But Dr. Oney doesn't receive the counsel in person. Her father has been dead for more than a year.

Instead, she hears the answers, given in her father's voice, on her phone through HereAfter AI, an app powered by artificial intelligence that generates responses based on hours of interviews conducted with him before he died in May 2022.
His voice comforts her, but she said she created the profile more for her four children and eight grandchildren.
“I want the kids to hear all those things in his voice,” said Dr. Oney, an endocrinologist, from her home in Grosse Pointe, Michigan, “and not for me to try to paraphrase, but for them to hear it from his point of view, his time and his perspective.”
Some people are turning to artificial intelligence technology as a way to communicate with the dead, but its use as part of the grieving process has raised ethical questions and left some who have experimented with it uneasy.
HereAfter AI was introduced in 2019, two years after the debut of StoryFile, which produces interactive videos in which subjects appear to make eye contact, breathe and blink while answering questions. Both generate responses from the answers users gave to questions like “Tell me about your childhood” and “What is the biggest challenge you faced?”
Its appeal doesn't surprise Mark Sample, a professor of digital studies at Davidson College who teaches a course called Death in the Digital Age.
“Anytime a new form of technology comes out, there's always a need to use it to contact the dead,” Sample said. He noted Thomas Edison's failed attempt to invent a “spiritual phone.”
'My best friend was there'
StoryFile offers a “high fidelity” version in which a historian interviews someone in a studio, but there is also a version that requires only a laptop and webcam to get started. Co-founder Stephen Smith had his mother, the Holocaust educator Marina Smith, try it. Her StoryFile avatar answered questions at her funeral in July.
According to StoryFile, about 5,000 people have created profiles. Among them was actor Ed Asner, who was interviewed eight weeks before his death in 2021.
The company sent Mr. Asner's StoryFile to his son Matt Asner, who was stunned to see his father looking at him and appearing to answer questions.
“I was blown away,” Matt Asner said. “It was incredible to me how I could have this interaction with my father that was relevant and meaningful, and it was his personality. This man I really missed, my best friend, was there.”

He played the file at his father's funeral. Some people were moved, he said, but others felt uncomfortable.
“There were people who found it morbid and were scared,” Asner said. “I don't share that opinion,” he added, “but I can understand why they would say that.”
'A little hard to watch'
Lynne Nieto understands that, too. She and her husband, Augie, the founder of Life Fitness, which makes gym equipment, created a StoryFile before his death in February from amyotrophic lateral sclerosis, or ALS. They thought they could use it on the website of Augie's Quest, the nonprofit they founded to raise money for ALS research. Maybe their young grandchildren would want to see him one day.
Ms. Nieto saw his file for the first time about six months after his death.

“I'm not going to lie, it was a little hard to watch,” she said, adding that it reminded her of their Saturday morning chats and felt a little “raw.”
These feelings are not uncommon. These products force consumers to confront something they are programmed not to think about: mortality.
“People are apprehensive about death and loss,” James Vlahos, co-founder of HereAfter AI, said in an interview. “It could be a tough sell because people are forced to confront a reality they would rather not interact with.”
HereAfter AI emerged from a chatbot that Mr. Vlahos created for his father before his death from lung cancer in 2017. Mr. Vlahos, a conversational artificial intelligence specialist and journalist who has contributed to The New York Times Magazine, wrote about the experience for Wired and soon began hearing from people asking whether he could turn them into a mombot, a spousebot and so on.

“I wasn't thinking about it in any commercial way,” Mr. Vlahos said. “And then it became blindingly obvious: This should be a business.”
A question of consent and perspective
As with other AI innovations, chatbots created in the likeness of someone who has died raise ethical questions.

Ultimately, it's a question of consent, said Alex Connock, a senior researcher at the University of Oxford's Saïd Business School and author of “The Media Business and Artificial Intelligence.”

“Like all ethical lines in AI, it will all come down to permission,” he said. “If you've done it knowingly and willingly, I think most of the ethical concerns can be circumvented quite easily.”
The effects on survivors are less clear.
Dr. David Spiegel, associate chair of psychiatry and behavioral sciences at Stanford School of Medicine, said programs like StoryFile and HereAfter AI could help people work through grief, like going through an old photo album.
“The crucial thing is to keep a realistic perspective on what you're examining, which is not that this person is still alive and communicating with you,” he said, “but that you're reviewing what they left behind.”