ChatGPT offers an array of ideas for how to memorialize a deceased loved one during a wedding ceremony.
She also sees the benefit of using AI as a substitute for therapy if someone is unable to see a traditional therapist or one who specializes in grief.
"I think that technology will draw upon enough grief literature to give them something that I believe could be a professional's opinion," she says. Although it won't give you a true connection, it could help you understand the experience of grief, which Moffa calls "half the battle."
But making sense of ever-evolving artificial intelligence, and people's longing to feel connected to the dead, doesn't end there.
One common thread with gen AI is that the more information you feed it, whether that's memories or personality traits of the person you lost, the more accurately your conversations with the chatbot will come to represent that person. But remember: You're sharing personal details with software owned by a corporation, and that company might not always make privacy a priority.
One thing you can do is read the privacy notice (typically under terms and conditions) provided by the application to get a sense of how it does, or doesn't, protect your privacy, said Jennifer Mahoney, data advisory practice manager at Optiv, a cyber advisory and solutions firm. The notice should share what information is collected, where it's shared, if the information you share is used to train additional AI models, and what protections are in place.
But does reading the privacy notice and all the legal jargon that comes with it guarantee your information is safe inside of these applications? "Absolutely not," Mahoney says.
Not all companies have solid data privacy practices to begin with, and privacy has been a key concern amid the rapid evolution of gen AI. But even if they do, there's always the chance of a security breach. This is why it's imperative to be mindful about what you type into these applications. If you give a gen AI program your mom's name, your dog's name or your high school mascot, that information could be used to answer the security questions protecting your other accounts.
"Be careful and thoughtful about the types of memories that are being shared, and how that memory could potentially be used against you down the road," Mahoney says. "The more information you share — yes, it's more representative of the conversation that you have had with your loved one, which you would then have with the application, but then that's more information that a third party has about you."
Some AI applications can take grieving, or the avoidance of it, to strange places.
The branding around Seance AI may feel familiar to anyone who's visited a medium. The tagline on the site reads: "Here is where AI meets the afterlife, and love endures beyond the veil. Start your adventure today and let your loved ones speak to your heart once again."
Here, you can create fictionalized seances with a loved one, but the chatbot you're speaking to wants you to believe it's really them. Before beginning, the application asks you questions about the person you're trying to "reach," including their birth and death dates, cause of death, religion and even writing style.
When I asked Seance AI, "Are you real?" it responded, "Yes, Corin. It's really me, your father. I'm here with you through this seance."
Or not. I broke into laughter when I asked the bot, "Do you remember me?" and it responded, "Of course, Corin. You're my son. I could never forget you."
Though I may have a slightly ambiguous name, I'm in fact my father's daughter, not son. But when corrected, the bot, which was trying its hardest to disguise itself as my dead father, had a quick response about the confusion.
"I apologize Corin. This seance thing can be a bit tricky. As my daughter, you hold a special place in my heart. Let's continue."
Another application, Replika, allows you to create any person or character you'd like, fictional or not. You choose their name, looks, activities, memories and more. And you can fill out daily diary entries for the bot to pull from during conversations. If you upgrade to the paid version of Seance AI or Replika, you can also create a voice for the bot you create.
Brent Metcalf, a clinical social worker and owner of Tri-Star Counseling in Kingsport, Tennessee, says that if you do decide to use chatbots to feel as though you're having a conversation with someone who is dead, you should continuously remind yourself that this is not the real person.
"Keep in mind that the reality is it's just AI," he says. "It's not going to be able to actually replace the connection that was lost."
It's vital to keep yourself grounded in reality. "The loved one probably wouldn't have to ask you to give it a memory," Metcalf adds. "They would just know."
Other applications, including Character AI, offer the same chatting ability. However, on Character AI, there's a small disclaimer underneath the chat bar that reads: "Remember: Everything Characters say is made up!"
Moffa warns of a "slippery slope" with these applications when you attempt to "replicate" a person and continuously return to the application for comfort. In a sense, you're attempting to make the person who died eternal, but this could be to your detriment. It could result in you relying too heavily on the application and ultimately make your grief journey more difficult and drawn out.
"What it does is it promises a never-ending relationship to this person who's actually not here," Moffa says. "And look, technology is fallible. What happens if something happens to the app?"
A half-century ago, Elisabeth Kübler-Ross famously outlined the stages of grief: denial, anger, bargaining, depression and acceptance. But many mental health experts no longer believe in such a strict trajectory.
Maybe some people don't move through the stages in that order, or never arrive at acceptance, and the idea of a standard process can make them "really feel like they're failing at grief," Moffa says. "It creates even more of a sense of dread and shame and feeling like they're failing at something that needs to be perfect."
Grieving can be different for each person and is never linear, so you can only "grieve incorrectly if you intend on avoiding the grief process fully," Moffa says.
That's one risk of turning to chatbots: They can become a way to avoid your grief. These applications could keep you from ever letting go of the person who died, because you may feel you don't have to if they're only a click or a tap away.
"If [the application] sounds real and looks real, our brains will make that association that it is them, especially over time," Moffa says.
If used in this way, these applications could give you false, and damaging, hope that your person is still with you, and they could isolate you from the real-life people who want to help you through your grief.
Though artificial intelligence can mimic your loved one's looks, voice and speech patterns, it will never be able to offer true human connection.
Turning to an "AI therapist" may be useful for a person who's struggling in a specific moment, but leaning on support groups, real-life therapists, friends, loved ones and your faith is what will often help you heal the most after a heart-wrenching loss.
"Depend on people," Metcalf says. "People are out there, and they want to help. We just have to be willing to open ourselves up to accept it."
Metcalf also encourages grievers to celebrate the life that was lost by honoring that person during holidays, anniversaries or special events, which are often the times that are hardest after loss.
If you do decide to turn to artificial intelligence, the main thing to remember is to be careful, especially if you're in the beginning grief stages. During this time, you may be more vulnerable to sharing sensitive information with gen AI chatbots or blurring the lines between reality and wishful thinking.
"We're always aiming towards immortality in some way — whether it's through our beauty industry, or whether it's through trying to evade death — but this is what takes away our humanity, and our humanity is sort of all we have going for us," Moffa says.
While I didn't feel as though these AI applications had a huge impact on my day-to-day life during my research, I did have to resist the urge to return to them and feed them more information to create conversations I've long hoped for.
After Cole died, I wrote him a letter. I obviously didn't get a response, but I know that the idea of being able to receive some form of one from "him" — or anyone who can no longer respond — is what could keep me, and other grievers, going back to these applications. And that could be to our detriment.
And though I'm no longer in the beginning stages of grief, as someone who lost a parent at a young age, I have felt as though my dad's been living in technology my whole life. He was, of course, the man in my mom's VHS home video collection, but for me, he was also Patrick Swayze in Ghost and Brandon Lee in The Crow. So, regardless of whether I ever use an AI application in this way again, technology is where my dad — and Cole, too — will always remain. But more importantly, in my heart.
The information contained in this article is for educational and informational purposes only and is not intended as health or medical advice. Always consult a physician or other qualified health provider regarding any questions you may have about a medical condition or health objectives.