Season 3 Episode 7
Bishop Steven Croft on Keeping Humanity at the Centre of New Technology

It is important to keep that which makes us human at the centre, both when building and using new technology.

S3E7 / February 18, 2021


Episode Description

In the early days of the internet, information technology could be viewed as morally neutral: it was simply a means of passing data from one point to another. But as communications technology has advanced, using algorithms, tracking and identifiers to shape the flow of information, we are confronted with moral and ethical questions about how the internet is being used, and about how it is even reshaping what it means to be human.

In this episode of Big Tech, Taylor Owen speaks with the Right Reverend Dr. Steven Croft, the Bishop of Oxford, Church of England. Bishop Steven, as he is known to his own podcast audience, is a board member of the Centre for Data Ethics and Innovation and has been part of other committees such as the House of Lords’ Select Committee on Artificial Intelligence.

Bishop Steven approaches the discussions around tech from a very different viewpoint, not as an academic or technologist but as a theologian in the Anglican church: “I think technology changes the way we relate to one another, and that relationship is at the heart of our humanity.” He compares what is happening now in society with the internet to the advent of the printing press in the fifteenth century, which democratized knowledge and changed the world in profound ways. The full impacts of this current technological shift on our society are yet to be known. But, he cautions, we must not lose sight of our core human principles when developing technology, and must ensure that we deploy it for “the common good of humankind.” “I don’t think morals and ethics can be manufactured out of nothing or rediscovered. And if we don’t have morality and ethics at the heart of the algorithms when they’re being crafted, then the unfairness will be even greater than it would otherwise have been.”

Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

Steven Croft: The world needs a very big conversation about this and it cannot be a conversation simply between those who are developing the technology and the technical people. It needs to be a big moral and ethical conversation.

Taylor Owen: Hi, I'm Taylor Owen and this is Big Tech.

Taylor Owen: Over the course of the past year, I've been taking part in an international working group on data governance, organized by a London think tank, the Ada Lovelace Institute. Every month or so we do a big Zoom call with experts from all over the world, and with all the faces on my screen, there's always one little square that stands out: the one that belongs to Steven Croft. Steven is the Bishop of Oxford, so the main reason he sticks out is that he wears a clergy robe and clerical collar. But the contribution he makes to these calls is through his focus on humanity, a focus less on tech itself and more on the ways the human experience is being touched by technology. In policy conversations that are often very technocratic, Steven is deeply philosophical. He has spent decades thinking about life's big questions: What does it mean to be human? What does real community look like? And how does spirituality shape our lives? And now he's bringing that thinking into the technology conversation. As a former member of the House of Lords Select Committee on Artificial Intelligence and a current board member of the Centre for Data Ethics and Innovation, Steven isn't just thinking about tech policy, he's actively shaping it. After centuries of religious dogma, it often feels like we now live in an era of technological dogma. We talk about tech companies being too big to fail. We debate whether AI or biotech will supplant the human mind and body. And the Elon Musks and Steve Jobses of the world have cultivated almost godlike personas. Three hundred years ago, it was science that posed the challenge to religious authority. Now the tech companies are the ones with the power, and they are using it to push into spaces and conversations that were once the domain of religious institutions. Steven uses precisely this historical and spiritual framing to challenge that power and to demand that we think more holistically, and even humanely, about technology. Here's Steven Croft.

Taylor Owen: I wanted to start sort of very broad here and I think a lot of people would be at least somewhat surprised or intrigued that the Bishop of Oxford is an expert on technology and increasingly on technology regulation even. And some might even think that science and technology are incompatible with religion. And there have been moments in history where there's been real tension there. I'm wondering how you think about that relationship.

Steven Croft: Yeah. No, for me, it's all about what it means to be human and human identity. My way into this whole area was through one of my sons. I have four adult children; they all do something different, but my eldest son is a computer games entrepreneur. He set up a computer games company with a colleague when he left university. About four or five years ago, he gave me a book to read that he'd been reading, by a guy called Kevin Kelly, called The Inevitable. Kevin Kelly's a former editor of Wired magazine. And I dipped into this and came across this astonishing paragraph, which kind of leapt out at me, which was that actually the real challenge of digital and artificial intelligence over the next 20 or 30 years was going to be asking questions of human identity, and that one of the questions machines were going to provoke was what it means to be human and what human flourishing is. And as I read that paragraph, it piqued my curiosity and made me ask a series of hard questions. In particular, it made me think that if this technology was going to be all-pervasive, all across the world, with huge ramifications, then the Christian faith, which is my own tradition, had something to say to that, because we've been thinking and reflecting about what it means to be human and about human flourishing for 2,000 years or more in the Jewish and Christian tradition. I began cautiously to explore this portfolio of technology and the regulation of technology and data. I served on a House of Lords select committee on artificial intelligence and the future, which was a massive learning experience: I sat in a room for three hours every Wednesday for a year, with a chance to hear from and to ask questions of the leading experts in the UK in this field. My brain kind of exploded every Wednesday afternoon. A huge learning curve. But through that inquiry, and not just through my input, the whole committee concluded that ethics needed to be lifted up within the dialogue about artificial intelligence and data. And that was one of our principal conclusions. And since then I've served on the UK government's Centre for Data Ethics and Innovation, trying to map what needs regulating and what doesn't, and what the real questions are.

Taylor Owen: And I definitely want to talk to you about the regulatory piece, but I don't want to lose this notion of both what it means to be human and the historical arc of theology engaging with that question. Have there been previous moments where that question has been provoked, where something has been introduced into the world that has fundamentally challenged theological notions of humanity?

Steven Croft: I think technology changes the way we relate to one another. And that relationship is at the heart of our humanity. And I think the era which is probably most similar to the present one, historically, is the introduction of the printing press and the change in technology that happened there, where there was a sudden democratization of knowledge, and that affected the world in profound ways. In the history of Christian theology, it's the era of Martin Luther and John Calvin and Thomas Cranmer and the other reformers who changed the political map of Europe, and it all flowed from a change in technology and the way we communicate with each other. It's not too much to say that the world is living through a similar democratization of technology and a change of relationships. And that certainly affects the life of faith communities and churches hugely, but also the way in which they're able to speak into and listen to their wider communities, make their resources available and have those resources challenged. It's not so much that the technology challenges the fundamentals of particular faiths or philosophies; it's more that there's an opening up and a democratization of dialogue, which is really helpful. And the faith voices are needed as part of that rich human dialogue going forward.

Taylor Owen: And perhaps in the past have helped stem some of the turmoil that was a consequence of that democratizing of information. You look at the printing press, that was a time of pretty significant uncertainty and instability in the world, wasn't it?

Steven Croft: Yeah. Yes, it was. And lots of change, really. And so finding moderation through that, and humanizing what is happening through technology, is really important. Of course, churches were not and are not perfect communities, and there were bad consequences of that engagement as well. But in the present fourth industrial revolution, the kind of technology which is happening is really challenging the fundamentals of the way human identity is bounded. The way we let technology companies into our inmost lives affects our sense of who we are and our community, and the way we're never switching off from the wider world, if we're not careful, affects us hugely, particularly when identity is being formed in children and young people. The world of work is changing hugely because of technology. There are so many areas where technology is changing human life and human society that it's important to have a big conversation across the whole of society about where we're going and what that's for.

Taylor Owen: I wonder if the conversation around AI, which explicitly involves the mimicking of the human mind and has a pretense of sentience ultimately at its core, or at least an objective of one day achieving human-like characteristics, has turbocharged this technology conversation in your world? It feels very different from just talking about platforms or cellphones or the changing nature of work. AI is striking right at the core of that human question, isn't it?

Steven Croft: Yeah, yes, it has, although my perception of how that is happening has shifted as I've explored the world of AI and data. I think when I began the journey seriously four years ago, what, as it were, kept me awake at night were the possibilities of artificial general intelligence, the stuff of science fiction really, and sentient machines, and what that would do to our sense of humanity. That intrigued me and hooked me into it. But as I explored further, it seemed to me that the prospect of those kinds of breakthroughs is really very, very distant indeed, in terms of the technology. What began to keep me awake at night instead was the stealthy adoption of narrow artificial intelligence across quite specific fields, which was not being monitored or governed or supervised and wasn't necessarily being thought about. Questions about what happens to our data, questions about the automation of tasks and work, and particularly algorithmic decision making in very sensitive areas of human life, such as the courts, finance or human resources, without governance. And those seemed to me to be fundamentally affecting people's freedoms. And then we had the debates about the role of the tech platforms in democracy and democratic debate, which are still hugely current. It seemed to me that all of those AI-driven technologies, which were using data, were putting immense power in the hands of individuals, largely unaccountable power, and that was something to be concerned about for the whole of society.

Taylor Owen: Yeah, that's interesting, because those are societal questions and democratic questions and community questions. They aren't just about our intrinsic humanity or what it means to be human. It seems like you and some of these technologists were starting at this core principle, but then it became about so many other things.

Steven Croft: Absolutely. And the ramifications just kind of spin out. One of the core reflections has been: how are AI and the data technologies best deployed for the common good of humankind? And even to use that language is to draw on a whole tradition of Christian, and particularly Catholic, teaching and thinking about what the common good is for the world, how you discern it, and acting for the good of all, not for the profit of some.

Taylor Owen: Do you think these technologies can be moral or immoral? Or is it how they're deployed in society? Is it valuable to think of them through a moral lens?

Steven Croft: I think it's essential to think of them through a moral lens. I think we have to use language quite carefully when it comes to discerning whether the technology itself is moral or immoral. And I think at an earlier stage in the development of the technology, it would have been much easier to say the technology itself is morally neutral. I think with the advent of algorithmic decision making and very complex decisions being made, it's not that the technology is being intentionally immoral, but the outworking of the technology may be opaque and may have unforeseen consequences, which may not be moral. There's a more complex equation going on now.

Taylor Owen: Yeah. And morality is one frame, but neutrality and bias and all of those things are other ways of getting into that.

Steven Croft: Yeah, no. Fairness is absolutely in the frame when it comes to morality. We had a really interesting and disturbing situation in the UK last summer when, because of COVID, people couldn't sit their A-level examinations. And so A-level grades were calculated through an algorithm. But because of the way the algorithm was put together, it advantaged people who were in smaller classes, who didn't have the same average estimate, and people who were at schools which had historically had worse grades were disadvantaged. And there was a public outcry because of the unfairness of this algorithm. It wasn't the algorithm itself making moral choices; the way it was set up was fair to the greatest number of people, but not to a number of individuals who might otherwise have been disadvantaged anyway, and it was therefore seen in the popular mind to be untrustworthy.

Taylor Owen: And that event spurred that remarkable scene of people protesting an algorithm.

Steven Croft: It did, that's right, yeah, yeah.

Taylor Owen: The signs against the algorithm.

Steven Croft: Yeah, well, the government of the day blamed the algorithm very publicly, and that in a way set back algorithmic decision making, or broke public trust and confidence in it. And the more journeying I have done with those who are trying to develop these decision-making processes for the common good, the more it has become apparent that retaining public trust and confidence in the technology is absolutely critical for realizing the good that the technology can bring. AI and data processing can bring huge benefits, particularly in the areas of medicine and diagnosis. But it won't bring those benefits, and we won't break through to the really good things that could be released, without that transparency and good governance.

Taylor Owen: And do you think a religious frame could actually help shape how these technologies are built? Algorithms are programmed by people with incentives and objectives and subjectivities and biases. Could some sort of religious or moral framing help in the actual development of those tools?

Steven Croft: I think it's essential, the moral framework. And I think one of the sources of morality and ethics within the world are the great faith traditions as well as the philosophies. Yes, I think it's essential. I don't think morals and ethics can be manufactured out of nothing or rediscovered. And if we don't have morality and ethics at the heart of the algorithms when they're being crafted, then the unfairness will be even greater than it would otherwise have been.

Taylor Owen: They need to be a founding, almost first, principle too.

Steven Croft: They do, they do. And all those who are involved in the development of algorithms and the oversight of algorithmic decision making need an ethical foundation. Absolutely. Clearly.

Taylor Owen: You had an article in the Church Times a few years ago that struck me, particularly one line: "That in the 19th century and for much of the 20th century, science asked hard questions of faith. Christians didn't always respond well to those questions and to the evidences of reason. In the 21st century, however, faith needs to ask hard questions once again of science." I wonder what you mean by that. First, what were those original questions that science was posing of faith? And how now should faith be questioning science?

Steven Croft: Thank you. Well, in the 19th century, there was a famous debate in Oxford in which one of my predecessors publicly debated one of Charles Darwin's disciples about On the Origin of Species. And basically my predecessor, the then Bishop of Oxford, was opposing evolution as a scientific principle because it was felt both to contradict the biblical record and to endanger the understanding of the dignity of the human person, by saying that we may, in some way, be descended from other primates. And in the debate between science and faith that ensued here, it was definitely the scientists speaking to the people of faith. And certainly on this side of the Atlantic, most people of faith have come to accept the scientific account of evolution, and that has not displaced our faith in God.

Taylor Owen: That was a good caveat on the Atlantic divide.

Steven Croft: Yeah, indeed. I know it's not the case across where you are, but I think in the 21st century, science has now accrued great power, and there is great power in technology. And the position of faith is almost like the position of some of the ancient prophets in the biblical record: needing to challenge unaccountable power, and those who are exercising great influence over the whole world and over generations without being questioned about their moral values and the framework within which they're exercising them. I think the technology companies need a similar challenge, and mine is only one voice among many; there are plenty of people in the world who are articulating that challenge. But I think, speaking from a religious tradition, a Christian tradition, there's a foundation for doing that. Are the weak being protected? Are the rights of children being set above the rights of technology to exploit them in different ways? And I think all of that needs good scrutiny.

Taylor Owen: That analogy to prophets feels so aligned with many of the attributes of these technologies. Steve Jobs is held up as a prophet. These founders are often seen as being all-powerful. The tools they're building have a pretense of prophecy: they can predict our behaviour and therefore control us, in our democracies and in the way we behave and act. Yuval Harari, in Homo Deus, talks about this combined prophetic power almost birthing a new religion, and about Silicon Valley having elements of religiosity to it. Do you see that emerging in that culture?

Steven Croft: Yes, I do. But I also see themes of pride and hubris, and I think I would see the prophetic voices more in those who attempt to identify and challenge the system: the work of somebody like Shoshana Zuboff in trying to expose surveillance capitalism and the financial models by which people profit; the work of those who've tried to uncover the Facebook Cambridge Analytica scandal and its role in democracy. I think when anything gets too big to be challenged, it needs scrutiny. And I also think that the technology companies are not neutral when it comes to their own values. There is a value system and a philosophy at the heart of Silicon Valley, which is about individualism, which is about the dream of making a vast fortune, and partly about a dream of improving the world. I recognize that. But it doesn't always have a concomitant concept of the evil and harm that can be done if technology goes wrong. And one of the fundamental ideas in the Christian tradition is that there are always negative sides and bad consequences and shadow sides to people, and that potential for evil and harm needs to be recognized, which is one of the reasons why good governance and scrutiny and transparency are so important.

Taylor Owen: Yeah. Community, and the power and strength of community, is at the core of the Christian tradition. And as you say, these platforms have fostered an idea that connection and community are at the core of what they do, while at the same time being very individualistic. Individual agency is actually the core of what platforms enable, not necessarily connection and community. Do you think they're thinking about community wrong? Mark Zuckerberg loves talking about community and connection being the core of their business, but is that really what they're doing?

Steven Croft: I think they're doing much more than that. There are some parts of their ideals I want to affirm: it is good to connect people, it is good to open people up to diversity and different experiences, and people are looking for love and for friendship and for affirmation. But I think people need to be alert to, for example, when that desire for love and community turns into the intentional fostering of addictive behaviours through hooking people in, in deeper and deeper ways; when it creates silos around people so they only listen to a small section of opinion; when truth is distorted. I think the tech companies need to be continually alert to the ethics and the effects of what they're doing, and they need external scrutiny. And I just think it's one of the deep lessons of human history that there are always bad consequences, even of technologies which appear good. We need to be alert to those and mitigate them.

Taylor Owen: That's something I really struggle with, because you do get some meaningful sense of connection with these tools. And in many ways it's a feeling of happiness, a visceral thing, when you use them. Is that just superficial connection? Or could these technologies actually foster more meaningful community?

Steven Croft: Well, one of the things I've observed during the last year of lockdown is how meaningful relationships formed through screens and tools like Zoom really are. I've been astonished. I was used to doing a bit of video calling a year ago, but my working day now is spent in Zoom and Teams conference calls. And I've really noticed how awkwardness in a room can actually extend over a Zoom conversation, and also a prayerfulness: in a Christian context, you can have a very reverent atmosphere over a call as well. The power of technology to transmit authentic human experience is extremely strong, especially when we have visual representations of one another as well as audio. And I think that is a testament not only to the power of the technology, but to the power of our humanity and the immense capacity of human beings to communicate even when they're not physically present to each other. I think all of that is really good, as long as it doesn't then become exploitative or addictive, one person exerting power over another.

Taylor Owen: Do you give sermons over Zoom or in virtual settings?

Steven Croft: Yeah, yeah. It's very hard to do, because when you're preaching or leading a worship service with a group of people, you're constantly getting feedback from how people are responding, whereas on a Zoom call you're often just looking at yourself reflected back if your face is spotlighted, and that can be a bit off-putting. It's challenging.

Taylor Owen: There was a conversation online the other day about the effect that seeing your own face on your Zoom screen has on the performative nature of your interactions. I think that really struck at something interesting: the way these tools are designed changes how we interact as people. Turning it off, so we don't see ourselves, makes us behave differently.

Steven Croft: Yeah, it does. It does. And I find now that audio-only calls are less rich as communication experiences.

Taylor Owen: It's so funny you say that. At the beginning of the pandemic, I was consistently asking people to do audio calls instead because I was getting so sick of Zoom, but now I find the opposite. Now I find when I speak to people just on voice, it feels like something's missing.

Steven Croft: Yeah, indeed. I think in that sense, the technology has been fantastic at bringing the world together in different ways. And the thing I miss most in the Zoom meetings I have is the conversations you have on the way into the room and on the way out, where you get to know your colleagues as people, which all helps shape your understanding in reading their views. I'm sure it must be possible to do more of that through the technology, or the way meetings are structured; actually allowing us to bring more of the people we are into conversations is, I think, what enriches those meetings.

Taylor Owen: Yeah. And beyond the performative aspect of interjecting on Zoom, that is a very singular type of social interaction; it is not the complete whole of a person. Okay, I want to switch a bit to the policy conversation, because, as you said, you have been involved in this. I recently spoke to Ron Deibert, an academic you probably know, who suggests that a policy framework might look radically different if we embedded in it some philosophic notions of classical liberalism, he would argue, rather than the sort of neoliberal or capitalist notion of technology that really seems to drive a lot of our policy solutions. The idea of commodifying data and owning data is an ideological precept that in many ways shapes our policy conversation. I'm wondering if religion could provide us with a different framework. When you're in these policy rooms, in the House of Lords or in commissions or whatever it might be, do you think you provide a fundamentally different framework for thinking about governance?

Steven Croft: I think probably no, but that's because I'm sitting in a parliament within the United Kingdom and within a Western democracy where the fundamental values, which are broadly owned across our democracy about the best we can be, are actually consonant with Christian faith and a Christian moral worldview.

Taylor Owen: Can you just explain what you mean by that? Why is that connection closer in the UK than in other countries?

Steven Croft: Well, it may be the same in Canada and the United States, but the European democracies in particular flow, in a very deep sense, out of the values formed by the Christian faith in Europe over thousands of years. There isn't a sort of conflict at the level of the ideal. Increasingly I have found that the call for ethics in the new technologies is not actually a call for a new kind of ethical framework; it's a call to apply the generally accepted ethical framework that is the best of our society: human rights, respect for the individual, fairness, those kinds of principles. It's not the generation of a new ethics; it's the consistent application of those ethical principles to technologies which are completely new and different. Yesterday, in a parliamentary context, I was sitting in a room with people from different political parties and none, looking at some legislation which is being developed in the UK around online harms. We were talking together about the huge range of different issues involved in online harms legislation. And there was absolutely no distinction between politicians of different parties, or between people of different faith positions, on the need to protect children, the need to protect democracy, a vast range of issues where we were not disagreeing about values. There is a legitimate debate about how those values are carried forward into legislation, and there's sometimes a different balancing of free speech versus governance and control, and in one or two other areas. But I don't find myself, as a Christian theologian or Christian minister, in those meetings seeking to or needing to impose a set of values on people. I do believe those values are of fundamental and prior importance, and sometimes my role within a conversation like that will be to draw people back to those first principles.

Taylor Owen: Some of that online harms conversation in the UK is framed around this duty of care concept.

Steven Croft: It is, yeah.

Taylor Owen: That's always struck me as being more of an overarching framework and values-based objective, in contrast to some of the other policy conversations you see in the EU, for example, that are very technocratic, legislatively specific, legalized and almost bureaucratized. Duty of care is something different, isn't it?

Steven Croft: It is. Well, it's been articulated as a fundamental principle of the new online harms legislation the government is developing: basically, that the companies who are developing software and applications need to have a duty of care to those who use their services. And it's part of the answer to the question, in whose interest is this technology going to develop, and who controls it? Is it driven and controlled by those who are profiting from it? Or is it actually a service being offered for the good of the whole of society, in the course of which some people will find gainful employment and make money? I think society faces a fairly fundamental choice between those two approaches, and I want to argue with every breath in my body that it should be the second of those, and that technology which is exploitative should be governed and limited and controlled.

Taylor Owen: A couple more questions here before we close. I'm sure you know Jeff Jarvis, or know of his work, and he has this frame, pushing back against critics of tech, that this is just another moral panic: that there's a community of people who are up in arms, righteously fighting back against a perceived evil, when the technology is actually, ultimately, good for people, democracy and individual empowerment. How would you respond to that?

Steven Croft: I just disagree, really, and I disagree because of the evidence of the harm that is being done by unrestrained technology. I don't know how things are in Canada on this, but we have a mental health crisis in this country, among teenage girls especially, but boys as well, and there seems no reasonable doubt that it has been fed by the ungoverned use of technology and the way in which people are opening themselves out to others, without personal boundaries, at a critical moment in their identity formation. I think that's a real harm. I think we've sat and watched and wept at the events in the United States over the last few weeks, where it's clear, from this side of the Atlantic at least, that a country has become divided because of the loss of common media, a common acceptance of truth and a commons for debate. And that has been shaped in part and fed by technology companies which have allowed the creation of bubbles in which different points of view can't be tested against each other. And so, no, I think technology fundamentally changes the way human beings relate to each other, and the balances of power are disturbed by new technologies, and therefore they need to be critiqued, and we need the public debate. It's not, in my case, a caution about technology in and of itself. I enjoy technology; I enjoy what it can do. But it is a desire to see a balanced debate and good governance of that tank as it rolls forward.

Taylor Owen: Just to close: what would you say to the people who are building these technologies and to the people who lead these immensely powerful companies, the Jack Dorseys, Sheryl Sandbergs and Mark Zuckerbergs of the world? If you were speaking to them, or even giving a sermon to them, how would you frame how they should be thinking about this?

Steven Croft: I think I would say the world needs a very big conversation about this, and it cannot be a conversation simply between those who are developing the technology and the technical people; it needs to be a big moral and ethical conversation. And I see encouraging signs that some people in technology are absolutely aware of this. I was at a conference in the Vatican a year ago where the Roman Catholic Church gathered people from all over the world, including Brad Smith of Microsoft and others, and he seemed to be very aware of these different ethical dilemmas and the need to develop a coherent ethics at the heart of technology. So I do see some signs of hope. But I think the other thing I'd say to the group of people that you named just there is that the world needs to see from them a greater awareness of the potential harm that can come through any technology, not just their own, and a desire to mitigate that as much as possible. And I know that slowly, I think, that message is being heard, and different countermeasures are being taken, and I'd want to encourage them very warmly.

Taylor Owen: Well, I hope they're listening, and I hope they're listening to you specifically on this. Thank you so much for talking to us about it.

Steven Croft: Thank you. That's great.

Taylor Owen: That was my conversation with Steven Croft.

Big Tech is presented by the Centre for International Governance Innovation and produced by Antica Productions. Please consider subscribing on Apple Podcasts, Spotify or wherever you get your podcasts. We release new episodes on Thursdays every other week.

For media inquiries, usage rights or other questions, please contact CIGI.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.