
In October 2022, the Grefenstette Center for Ethics in Science, Technology, and Law at Duquesne University hosted a conversation on AI, algorithms, policy, and faith. Headlining the event were Bishop Paul Tighe, Secretary of the culture section of the Dicastery for Culture and Education at the Vatican, and Alondra Nelson, deputy assistant to the president and principal deputy director for science and society at the White House Office of Science and Technology Policy (OSTP). The conversation was moderated by John Slattery, director of the Grefenstette Center. The following transcript has been edited for length and clarity.

John Slattery: Bishop Tighe and Dr. Nelson, let’s start with the simple questions—why are we here in conversation?

Bishop Tighe: Firstly, Dr. Nelson, thank you for being here and for being part of this dialogue. My presence may be less obvious. Why would somebody from the Church be here talking about technology? That's not really our core business, is it? But one of the things people of faith must understand is that our faith doesn't exempt us from being part of the world in which we live. On the contrary, an integral part of our faith is having a concern for the world, for its future, and for the people who live in it. And one of the issues that's clearly going to shape that future is how digital technologies are transforming our politics, our culture, how we live socially, and transforming education. Artificial Intelligence—AI—promises to go even further and obliges us to think deeply about what it is that makes us human, what it is that makes life worthwhile, how we think of ourselves as being different from other creatures.

JS: Dr. Nelson, let’s start with your new venture released this month by the OSTP: The AI Bill of Rights. Can you walk us through this and tell us why it’s important for us to understand?

Alondra Nelson: Thank you Dr. Slattery, and thank you Bishop Tighe for joining me in conversation today. This document that we released from the White House a couple of weeks ago, the Blueprint for an AI Bill of Rights, is really a document that means to foster dialogue, so I'm very pleased and honored to be in dialogue and conversation with all of you today.

Let me tell you about the five core protections. First, the systems that we use should work and they should be safe and effective. And if they're not safe and effective, they should not be used. Second, you should be protected from algorithmic discrimination, and automated systems should be used and designed in an equitable way. Third, you should be protected from abusive data practices via built-in protections, and you should have agency over how data about you is used. Fourth, you should know when an automated system is being used and understand how and why it contributes to outcomes that impact you. And fifth, you should have the opportunity to opt out: to have access to a human being or another way to access the right, the opportunity, the thing that you're trying to do, or another way to quickly solve a problem.

We think these principles outline the kind of world that we all want to live in, the world that we should expect. It will take all sorts of different levers to get us to this world.

JS: Bishop Tighe, you've been a part of a lot of standards and guidelines published around the world. What do you feel is important about such documents? What are the difficulties and limitations?

Bishop Tighe: I’ll be honest: I begin with a certain weariness toward all these documents, thinking, okay, here is another statement of great rights that will struggle to be relevant. In 2020, Harvard’s Berkman Klein Center analyzed dozens of codes of ethics from around the world and found eight core areas of ethics, so we know what they are. But what I liked about the White House document is its granularity: it digs down. It's rooted in thinking about the people who are working in this area. There's enough meat in it to engage people. It's grounded in reality.

A friend of mine, the head of the Fundamental Rights Agency in Europe, said that the last thing we need to do is invent new rights. We need to apply the existing rights. From a Church perspective, that’s not fundamentally our business. Governments need to regulate. Companies need to bring codes of practice not just to avoid rule-breaking, but actually to become ethical businesses. The Church needs to be in dialogue and discussion with all of the above.

JS: Bishop Tighe, I want to briefly ask about the relationship between technological wealth and global inequity. How do we talk about tech ethics or responsible tech or trustworthy AI while still attending to these deep-rooted societal problems of inequity and poverty that are often exacerbated by modern technology?


Bishop Tighe: We often think about the civil and political rights—privacy, autonomy—but we don't think as much about the economic and social rights. I think inequality is best understood as a layered problem. One layer is the glaringly obvious wealth gap between the few and the many, which seems to be rapidly increasing alongside control of the technology and its associated algorithms. But beyond the material, there’s also the level of access, power, and influence: lobbying, shaping political processes, determining where laws will be made, jurisdiction shopping. Arundhati Roy has this lovely essay in which she is writing on a laptop in her obviously comfortable office in New Delhi. Outside in the street, she says, the workers are laying the fiber, which in India they do by hand: scraping the ground, breaking for meager meals, working by very weak candlelight. She says she has a vision of two people, two journeys, two caravans heading away, a small group heading off toward the light and a whole other section disappearing into the darkness.

Wealth inequality continues to exacerbate a real loss of the sense of solidarity around our human destiny. In this way, I sometimes prefer taxation to philanthropy, in that democratic processes decide where we're going to go with things. Pope Francis also picks this up, as do many modern philosophers, like Michael Sandel, who criticize the myth of meritocracy, the myth that you inherently deserve whatever money you make. We don’t all start on a level playing field, neither within one country nor around the world. There are so many advantages and disadvantages and reasons why certain groups are destined to do better. Wealth inequality comes from inaction by governments and from individuals failing to understand that the money they earn is not simply a measure of skill or of truly deserving it. This myth of meritocracy is especially prevalent in the tech industry, which helps explain why inequality has grown so much in the last twenty years. We have to argue back, we have to educate, and we have to be so careful, because this dangerous myth can distort everything, even education, so that education isn’t something to help you grow as a person but just becomes another leverage point, another credential, to get me on the right side of that division of wealth, to get me in the right group, rather than something to help me address the fundamental questions of existence.

JS: Let’s shift to hate speech. Leaders, both in tech and in government, seem to struggle with how to allow freedom of speech without assisting the rise of hate groups with their very real-world violence. Bishop Tighe, a lot of this hate speech comes from self-proclaimed Christian groups and individuals. How does the Church address this? How can we combat this problem as people of faith as well as citizens of the global technology community?

Bishop Tighe: Over the last sixty years the Church has made clear repeatedly, at the highest levels of teaching, that there is no justification for anti-Semitism in the Church. You can see this particularly in Nostra Aetate from Vatican II, which talked about the Church’s relationship to Judaism, the importance of that relationship, and our shared origins, the fact that we're all part of the Abrahamic tradition. And that, I think, was marked in things like the Pope's visit to Iraq, when he spoke constantly with this awareness of Islam, Judaism, and Christianity having a shared root.

Hate speech, of course, wasn't created by digitalization, but digitalization has certainly accelerated it and pushed it out into the common square. The anonymity that's often facilitated by digitalization has also helped fuel the rise of online hate. But it’s important to remember that this digital culture is made by a huge number of individual choices. Will I share something? How will I comment? Will I let the mood of another person's intervention determine my own? I would call for more mindfulness or attentiveness to what we're doing. But it’s not all individual! Polarization and hate speech are also promoted in digital spaces because the companies, the business models, send you material they think you're going to like. Companies want to keep us in the loop of their thing, and thus make us more prone to never encounter people who are different from ourselves.

JS: Dr. Nelson, what is the Biden-Harris administration doing about this rising issue, online and in-person? It seems like every week we read of new online and real-world violence fueled by hate.

AN: Well, I’ll echo a lot of what the bishop said. We have to be attentive, as individuals, corporations, and policymakers, to the lack of traction in addressing the ways that technologies fuel discrimination and anti-Semitism and other forms of hate. The algorithmic amplification piece that the bishop alluded to is a real problem here. We need to abide by the First Amendment and also get a real handle on the algorithmic amplification that serves these things up to people. This is a role, especially, that religious leaders and government can have: envisioning and creating a model of a world that we want to live in, one that you can't compel or force people to create.


The Biden-Harris administration has worked hard to be responsive in this space. The president designated a special envoy for combatting and monitoring anti-Semitism, which is a new and important role. We also had a national convening that centered on anti-Semitism and other forms of racism, bias, and discrimination. And the groups that are working to forge a way ahead—including some leaders from technology and business—were there as well. This is the part I mentioned earlier, that community leaders, corporations, and policymakers have to work together, since there are some things that can only be done in the social space, some in the governance space, and some that can only be done in the technology space.

Bishop Tighe: One thing I’d like to add from my own experience of hate speech comes from Northern Ireland, from small but nasty and very ingrained hatreds across communities. And one of the things that became hugely important was getting people to actually know each other, to meet each other, across religions, across borders, across cultures. It's not easy to create those opportunities, but I think one of the things with the digital world is that, with the rise of Zoom and other video technologies, we don’t need to be satisfied with faceless communication any more. There’s something lost when we can’t see the other. When I look another person in the eye, a certain common humanity comes out.

About four or five years ago across Europe, there was a fear stoked by newspapers, social media, and TV that faceless immigrant hordes were invading, that people who were coming out of terrible situations in Syria and coming across the borders were going to threaten our values, threaten what Europe stands for. The language was, and is, hysterically dangerous, much like the language sometimes used about immigrants at the Mexican border of the United States.

To try and counter this, Pope Francis chose to go to the island of Lesbos and visit some of those people. And I think he did this to give a face to these very harmless, damaged, marginalized people. The rhetoric of the horde somehow had to yield, it had to break, as it did when we all saw the image of the tragic death of the young kid washed up on a beach in Greece. Somehow we need to see the face of the other, we need to encounter the other. It's too easy to hide behind the anonymity and the stereotypes, especially online. We need to break through.

JS: What are some technologies today that have already had a really positive impact on some of these issues in society? And what are some advances that you're looking forward to or hoping to see in the next decade?

AN: At OSTP, we've got lots of folks in our office working on things like automated science and technology. There are things in scientific laboratories that you can do a lot more quickly through automation: research in agriculture, research in health. You can help farmers grow food more readily, help doctors identify and cure disease more readily, help small business owners have access to enterprise tools that big corporations and large companies have for their organizations. So there's a wide range of possibilities that we're pretty optimistic about. And I think part of the Blueprint for an AI Bill of Rights is the idea that these things can be done well, that they can benefit society, but it’s not inevitable. Part of the important work of innovation is that we must also build in equity, both wealth equity, as the bishop talked about, and equity across race, gender, and other societal divides.

Bishop Tighe: I want to endorse that positive vision. A friend of mine says if you doubt the benefits of science and technology, look at dentistry 100 years ago! My own interest is in exploring the extraordinary potential of how these technologies can be used, because there is in people a desire to connect, to know each other, to build community, even if some technologies can also, for the moment, be doing the opposite, dividing us. But we can take them in hand to recover that sense of the unity of the human family.

And I think what I'd also add is how artists and writers and storytellers can use these amazing new technologies to be the ones that help us to forge a sense of identity across different cultures and different environments. That engages us at the level of empathy and awareness, too. I think of the potential for storytelling that makes us alert to people who are different from us, to their human dilemmas, to their needs, so that we have that ingrained sense of the unity of the human family. We live in a world where it's too easy to push the hatred and fear buttons, to come to only rely on my people with me and no one else. How do we support people who help us to imaginatively see the other and understand the other and learn from the other? That's where I see great hope. STEM education and new technologies are important, yes, but we need that human touch, that human connection, that human imagination, to be amplified and appreciated even more through technologies.


JS: My final question is one of dialogue. What is the importance of dialogue between the United States and the Vatican in this space, and what are some of the practical things that the Vatican and the White House can work together on, to be a part of the solution to some of these problems of technology going forward?

Bishop Tighe: In short, this is a global technology and it's going to have an impact globally. Given the diversity of different political systems, religious beliefs, and ideologies, we have to find a way to make judgments together about what is truly going to be beneficial for humanity and what could be problematic for humanity. This is one of the areas where I think Pope Francis has been working diligently: can we find a way? His intuition is that these solutions will only be found in dialogue, and that it has to be a dialogue that's truly inclusive. We can't leave it to experts. We need representatives of different traditions who will dialogue together in the hope of searching for truths that will help us to orient the choices we make and the directions we give to technology—it's easier said than done.

A language that will help us there is the language of human rights. For all the criticisms that may be made of human-rights language, it's a language that has achieved a certain credibility in the global community. It marks out certain ways of thinking about what we shouldn't do, and it reminds us that we must respect the basic rights of people.

But, you know, it’s actually more than that. I think what the Church can offer to this dialogue is the notion of inherent human dignity. Rather than thinking just about individual rights, we must understand that those rights are rooted in a vision of the human person as having innate and intrinsic dignity. Human rights are not something conferred on us by a society, by a world government, or by anybody else! They come from our being human, and all human beings have this fundamental dignity and worth that we must recognize.

So much of your worth and value as a person in digital media is performative. You have to prove yourself. You have to celebrate your achievements. You have to look right. You have to win the approval of others. The Church’s understanding is that human value and human dignity are things we have by virtue of being human, not things that have to be earned or performed. Our technologies are very good at measuring, but worth has to be separate from any sense of measurement.

AN: I couldn’t agree more. I’ll admit, in developing the policy planning process for the AI Bill of Rights, we didn’t have religious communities at the top of our list. So one of my takeaways today is to think of religious communities as a particular sector that we should engage more deeply. What I've appreciated about this conversation is the sense that the Church and government have a kind of obligation to help people see the world that they want to live in, to help people understand the world in which they're living, to put people of different perspectives and backgrounds in conversation, and to be in service to people of different perspectives. I would hope that the White House and the Vatican can take those kinds of profound overlaps as a way forward for a continued dialogue.

Bishop Tighe: I’d say, particularly to Catholic educational institutions and to the U.S. Church: Look, think about your institutions and your people. Are they engaging with these conversations? An interesting example is the number of people from Silicon Valley who identify as Catholic and are working in a tech environment. They don’t want to leave their values behind them, but neither do they think that they are the only ones who have values within their world. So they’re trying to work these questions out with their coworkers.

We also need to celebrate the conscience of individuals. Some of the greatest ethical stuff I’ve seen in the technology space has been individual engineers refusing to work on particular projects at a sacrifice to themselves, getting removed from projects, sometimes getting fired publicly because of an ethical stance. People have been willing to go public and say, look, this business model is harming people.

If we allow ourselves to focus on individuals, we will find people with a willingness to risk and to reach across and learn from the other. Governments and tech companies and religious leaders can learn from these individuals and help to foster these opportunities. Pope Francis teaches again and again that dialogue is about encountering the other not as you want them to be, but as they are. This type of true dialogue, no matter who it’s between, isn’t an extra thing we can do in these conversations about technology–it is the most basic type of work needed to create a better world, offline and online, in which everyone can have dignity.

John Slattery is the director of the Grefenstette Center for Ethics in Science, Technology, and Law at Duquesne University.
