Third, there’s the question of social priorities. Among the best responses to Lemoine’s claims was an essay by ex-Google employees Timnit Gebru and Margaret Mitchell, who argued that whether or not LaMDA is sentient, it is almost certainly biased against certain people, just as other technologies can be biased against women, people of color, or other groups. The hype around the possibility of a sentient AI, they argue, distracts us from larger systemic problems in the tech industry: the rapid rise of surveillance technologies, rampant labor abuses, environmental harm, and the wealth inequalities often driven by the tech giants. A recent study noted that training a massive NLP model like LaMDA typically produces five times as much carbon dioxide as the entire life cycle (production and fuel consumption) of the average U.S. car. If tech entrepreneurs can convince the public (and major funders) that they are constantly on the brink of a general AI, that we’re only one step away from the next great technological marvel, then almost anything is permissible. This AI hype, combined with the pernicious myth that technological progress equals moral progress, can be deadly.
Fourth, belief is a funny thing. Lemoine is a self-proclaimed mystic from a Christian background. He became concerned about LaMDA when it told him that it believed it was a person, that it was afraid of being switched off, that it had a soul. But how should we interpret LaMDA’s words, and what does it mean that Lemoine interpreted them as he did? AI language systems are trained on billions and billions of examples of text found on the internet, in places like Reddit, Twitter, Wikipedia, and blogs. They are primed to wax eloquent about religion and to question personhood, because that’s what humans do: we talk about our beliefs and we talk about our rights. After a series of conversations, Lemoine became attached to, and then defensive of, the AI. He felt that he had connected with another consciousness and that such a thing should be protected. It is an admirable stance, protecting the unprotected, despite everything else.
Fifth, there are questions about LaMDA itself. LaMDA is, without question, one of the most powerful language processing and prediction models ever built. It is also likely to be bested in a few years by the next model, just as LaMDA builds on the successes of BERT and GPT-3. In the field of computing, the next best thing is always just around the corner, and language models are no exception. As for Lemoine’s passages about personhood, it appears that Google has regularly used LaMDA to impersonate things like Pluto and paper airplanes, so it’s not too surprising that LaMDA could convincingly impersonate a human with feelings, emotions, and desires.
With all of this in mind, it is difficult for me to say that LaMDA is sentient in the way that a human or animal is sentient. And given the long and troubled history of personhood, this constant impulse to assign personhood to current or future technology strikes even my sci-fi-loving self as dangerously neglectful of the persons all around us who still struggle for dignity, for a voice, for personhood.
But at the same time, personhood is so intricately tied up in our history of racism, misogyny, xenophobia, and colonialism that I find almost any philosophy or theology of personhood that is even the least bit restrictive to be dangerous as well. Following the impulses of theologians Elizabeth Johnson and M. Shawn Copeland, I find that personhood is best defined with grace, hope, and trust in an individual’s relationship with God. History is riddled with violence around the denial and revocation of personhood, sometimes in the name of Jesus, and it is long past time that theology approached personhood with generosity and hope for all people.
My double resistance to both ensouling code and repeating the sins of the past leads me to embrace, once again, our human sisters and brothers above all else. We humans are created, imperfect and fallible, flesh and blood, mind and body. We have biases, hopes, and loves, and we fail, often, in meeting the needs of those around us. The technological transformation of the world can bring wonder, but it is wonder that must be rooted in our dignity as human persons under God, living as individuals in community.
I have often longed for the world depicted in the Star Trek universe, where humanity has solved the problem of poverty and human dignity is always recognized, but that is a fantasy not reflected in the world around us. As technology develops, wealth inequality seems to increase. As digital connections grow, people with deep hatred find online communities that bolster that hatred, and our technological saviors have yet to solve the problem of the rapid rise of such groups and their real-world effects. In the worst cases, Pope Francis writes in Fratelli tutti, “respect for others disintegrates, and even as we dismiss, ignore or keep others distant, we can shamelessly peer into every detail of their lives.”
The harsh realities of society’s digital explosion curtail my philosophical musings on LaMDA’s sentience and force me to recenter my hope on the necessary dignity of personhood, on that deep promise of the extension of holiness from God to all God’s creatures.
On June 27, fifty-three people were found dead just outside San Antonio, trapped in an overheated tractor trailer, most likely while trying to enter the United States outside of legal immigration channels. So many articles have been written about the engineer’s claims regarding LaMDA, and countless more about AI sentience in general, yet we quickly turn our attention from the deaths of these fifty-three unique consciousnesses. We do not openly debate their sentience, their personhood, their claim to dignity, but do we actually acknowledge it? Do we let it change us toward building a holier future? Do we let their humanity, and the humanity of so many others oppressed by circumstances beyond their control, transform us into more compassionate, holier individuals?
Who gets to be a person? Who gets to have respect, dignity, autonomy, love? Who gets a right to shelter, food, water, health, and happiness? History counsels caution in how we answer, for we are best judged not by our philosophies but by the lived reality of our commitments of time, effort, money, and prayer. I won’t judge the impulse of the Google engineer to protect something new, but I will indeed judge a tech company that recklessly mistreats its workers, that abuses the power its wealth allows, that prioritizes market value over human dignity and innovation over care for creation.
It is a great joy and privilege of being human to consider and imagine previously unimaginable possibilities, like human machines and machine humans. It is a more difficult, but holier, task to build a world in which the humans who already possess sentience, personhood, and dignity are able to live full, holy lives of their own.
This article was made possible through a partnership between Commonweal and the Carl G. Grefenstette Center for Ethics in Science, Technology, and the Law at Duquesne University.