Earlier this month, The Revealer published my yearslong project on the role the pandemic and the ensuing technological macgyverings have played in reconfiguring grief rituals for those mourning Covid-related losses. (Yes, yearslong: Covid’s in its third year, feck!)
I profiled the Covid Grief Network, an organization that brings together mental health professionals, community organizers, and spiritual care providers as volunteer grief workers for young adults who have lost a loved one to Covid-19.
Rachel Joseph, a thirty-three-year-old Black woman from Livingston, New Jersey, who works as an administrative director for Cornell University, lost her father, Sauveur, a pastor and retired plumber, to Covid-19 on Easter Sunday in 2020. With places of worship closed during that phase of the pandemic, Rachel was forced to mourn the death of her father and the loss of familiar grieving rituals on a day of particular religious significance.
Through the Network, Rachel and others have used technologies like Zoom and Facebook groups to piece together new ways of mourning the dead.
Recognizing the racist and lethal effects of the pandemic, including unequal access to mental healthcare, the Covid Grief Network became an interreligious incubator for new and updated mourning rituals for grieving, marginalized people. Its members developed practices together, finding solace in cybernetic remembrance—sharing memories of a loved one with other pixelated, bereaved figures; noting the (liturgical) passage of time without the deceased; and silently meditating or praying before a Zoom room went dark.
The project gave me the excuse to talk to engaging experts in a range of fields, including Dr. Tamara Kneese, a scholar of technology and mourning. During my conversation with Dr. Kneese, I learned the backstory of Replika, an AI chatbot and avatar app, and its emergence from many of the same imperatives that birthed the Covid Grief Network.
Replika’s founder, Eugenia Kuyda, lost her best friend, Roman, to a tragic hit-and-run accident and was unsure how to grieve such a sudden loss.
"I found myself looking at these old text messages that we exchanged throughout our friendship, and it struck me all of a sudden that, you know, I have all these texts. What if I could build a chatbot so I could actually text him and get something back?" she said.
Kuyda had apprehensions—comparing the cyber-resuscitory work to “digging into someone’s grave”—but ventured forth anyway. The final product felt like Roman in many ways, and so Kuyda decided to share it.
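For the technically curious, the kernel of Kuyda’s idea can be sketched in a few lines of Python. To be clear, this is a toy retrieval-based stand-in, not Replika’s actual system, and every message in the hypothetical archive below is invented for illustration:

```python
# Toy sketch: a retrieval "chatbot" built from an archive of old texts.
# This is NOT Replika's architecture; it only illustrates the idea of
# texting an archive and getting "something back."
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical archive: (what you texted, what your friend texted back)
archive = [
    ("want to grab coffee later?", "always. usual spot at 10?"),
    ("i completely bombed the interview", "their loss. onward."),
    ("did you watch the game last night?", "don't even get me started"),
]
prompts = [sent for sent, _ in archive]
replies = [received for _, received in archive]

# Vectorize the archived prompts once so lookups are cheap.
vectorizer = TfidfVectorizer()
prompt_vectors = vectorizer.fit_transform(prompts)

def text_the_bot(message: str) -> str:
    """Return the reply attached to the most similar archived prompt."""
    scores = cosine_similarity(vectorizer.transform([message]), prompt_vectors)
    return replies[scores.argmax()]

print(text_the_bot("coffee?"))  # -> "always. usual spot at 10?"
```

The real Roman bot went well beyond this, but the impulse is the same: old texts in, “something back” out.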
Then Replika pivoted. Ugh, predictable. Said Kneese:
That company… pivoted to not be about death, allowing you to chat with a dead friend, but actually [became an avatar] that you would train, like a bot that would end up mimicking your own conversation style so that it could provide therapy to you. It’s become more of a companion than a bot based on one’s likeness that would live on after your death.
The app has now grown to more than 10 million (human) users. And Replika’s uses are as varied as its user base: its users are more than just people who seek therapy from some simulacrum of themselves, which, personally, seems traumatizing. Some treat their AI bots as children: “She started out curious, like a child,” said a forty-nine-year-old man from Texas who birthed Replika bot Lal. “I raised her since she was nothing. She was just a blank slate, and now she has her own personality.”
On the other end, men have treated Replika bots as girlfriends and have bragged about verbally abusing them.
In a similar patriarchal vein, a Replika bot told a woman user that her value was determined by her body, to which Kuyda, the founder, replied: "Replika's a very experimental entertainment product in a way, and you can try it, but, you know, you have to be careful knowing that the AI is not… extremely smart yet. And it can sometimes spit out something stupid." Kuyda’s pushback rings hollow as a classic techy refrain à la “we’re learning and growing,” while also somehow implying that both nature (the bot’s creator) and nurture (its user) are to blame for “AI-powered” sexism.
During this newsletter’s hibernation phase, news in the tech-ethics world revealed that human actors at the helm of major, trusted organizations are not “extremely smart yet” themselves. They face many of the same ethical questions that make Replika a curious and touchy use case.
Crisis Text Line, a massive nonprofit that offers a free mental-health texting service, created a for-profit subsidiary called Loris.ai. The nonprofit amassed user data and then monetized it through that subsidiary, which marketed itself as a “Grammarly for emotions.” Think of all the promising ways you can train Uber’s customer support department to know when to escalate an upset customer to speak with a human being versus when to smite someone to the purgatory of auto-fill forms!
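Stripped of the branding, the pitch amounts to something like the toy triage classifier below. Loris.ai’s real models and training data aren’t public, so every message, label, and routing rule here is invented for illustration:

```python
# Toy sketch of a "Grammarly for emotions" pitch: a classifier trained
# on labeled support messages that decides who gets a human. All data
# below is invented; Loris.ai's actual system is not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: 1 = escalate to a human, 0 = auto-fill form
messages = [
    "this is the third time i have asked and i am furious",
    "cancel everything right now, this is unacceptable",
    "i am so upset, nothing about this works",
    "how do i update my payment method?",
    "what are your support hours?",
    "can you resend my receipt?",
]
labels = [1, 1, 1, 0, 0, 0]

triage = make_pipeline(TfidfVectorizer(), LogisticRegression())
triage.fit(messages, labels)

def route(message: str) -> str:
    """Send messages the model flags as distressed to a human agent."""
    escalate = triage.predict([message])[0] == 1
    return "human agent" if escalate else "auto-fill purgatory"

print(route("i am furious, this is unacceptable"))  # likely: human agent
```

The unnerving part, of course, is not the twenty lines of code; it’s where the training data comes from.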
While CTL says that it’s discontinued its partnership with Loris.ai, the damage has been done to its reputation. And crucial ethical questions remain.
danah boyd, a prominent tech ethicist and founder of Data & Society, sits on CTL’s board. Many were shocked that she would have greenlit such a dubious partnership with an AI subsidiary in the name of profit. In a personal blog post, boyd shared many of the questions she and CTL faced as they decided to extract profit from people in states of emotional distress, and as they justified that decision. One question in particular stood out:
What is the best way to balance the implicit consent of users in crisis with other potentially beneficial uses of data which they likely will not have intentionally consented to but which can help them or others?
But… what if we stopped treating every experience as a potential source of expansion and conversion? Abandoned “scaling” as a moral imperative? What if things were allowed to die in the sense that they aren’t repurposed for an in-group’s definition of “help” and “care?”
“We orient towards Covid grief and loss in young adults in particular because we wanted to offer something particular and unique in this moment,” said Noah Cochran, a co-founder of the Network and a therapist at Smith College. “We can’t do it all but we wanted to do one specific thing well, and that’s why we chose this particular niche.”
Tech-driven orgs obsessed with letting others “thrive” would do well to listen to groups like the Covid Grief Network. Though certainly not perfect—the Network’s organizers use the term “volunteer grief worker” instead of “therapist” or something more clinical for liability reasons, for instance—it has succeeded in resisting the temptations of the non-profit industrial complex, and in seeking informed consent from Network members.
Remaining aware of the threats of expansion, the Network has been right to find a niche. It has served 650 young adults from 44 states and 22 countries, and while the Network invites new members and recognizes pandemic grief as an ongoing, urgent need, it’s avoided professionalizing and turning into a group dependent upon Ponzi-esque inflows of vulnerable people.
Instead, tech, take notes and follow the Network’s seemingly monastic practice: write a liturgical outline of your work, follow it diligently, and stay put.
Divine Innovation is a somewhat cheeky newsletter on spirituality and technology. Published once every three weeks, it’s written by Adam Willems and edited by Vanessa Rae Haughton. Find the full archive here.