A guest post by Mark Gilbert

In the parable of the Good Samaritan (Luke 10:25-37), two upstanding men, pillars of their community, cross the road to avoid a wounded man. In contrast, a hated outsider stops, takes pity on him, tends his wounds and pays for a hotel room.

It’s crazy, selfless and the loving thing to do. There would have been so many reasons for the Samaritan to cross the road, to distance himself – the wounded man could be faking it; he could be dangerous. “His people have been our enemies since they rebuilt their temple in Jerusalem five hundred years ago, and especially since they destroyed our temple on Mt Gerizim one hundred and fifty years ago! Surely someone else will look after him, and besides, isn’t that for his people, with their great big temple in Jerusalem, to do?”

Today we struggle with the same moral dilemmas. Jesus’ message that all people are neighbours has profoundly affected our whole society, not just professing Christians. And yet… faced with a hurt stranger, we often find ourselves on the other side of the road. How did we get there? Why did I cross that road?

Many of the reasons we give today are similar, but our society is more complex than first-century Judea. The welfare state has institutionalised some help for the less fortunate. Taxes are automatically deducted from wages, while Jobseeker’s Allowance, Child Benefit, Carer’s Allowance and the State Pension arrive in the accounts of those who need them. This changes the shape of the dilemma, and shifts the curve of the road where we meet the vulnerable and hurting. They now depend on the functioning of the state benefits system, on Universal Credit and the Department for Work and Pensions (DWP), to keep body and soul together.

The social security system can express even part of our individual love for the poor only if we make sure it actually delivers to the disabled, the unemployed, the elderly and the long-term ill. Without our collective attention, it will fail. This danger is amplified by the use of digital technology. Over the past few years, many decisions once made by humans, especially around fraud detection, have been automated using statistical and machine-learning algorithms. Although treated as competent, these systems have often been found to perform poorly, and to act in prejudiced ways that would be unacceptable in a human agent.

In July 2016, the Australian social security agency Centrelink launched a new automated compliance scheme, later known as Robodebt, that used Australian Tax Office annual income records to check the fortnightly income reported by benefits claimants [1]. It was hailed as a triumph of modern data technology, with the responsible government minister claiming an error rate of only 1%. In fact, this was just the error rate for data entry; the false positive rate for the algorithm itself was later estimated to be around 27%. By the time the scheme was finally scrapped in 2020, following scandal and public outcry, around 470,000 incorrect debt notices had been issued, causing hardship and financial worry for hundreds of thousands of people.
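
To see how such a system goes wrong, consider the flawed assumption at the scheme’s core: that annual income is earned evenly across the year. The sketch below is illustrative Python, not the actual Centrelink code, but it captures how “income averaging” flags a perfectly honest claimant who worked for half the year and claimed benefits for the other half.

```python
# A simplified illustration of the "income averaging" assumption.
# Not the actual Centrelink code: names and figures are invented.

FORTNIGHTS_PER_YEAR = 26

def flag_discrepancies(annual_income, reported_fortnightly):
    """Flag every fortnight where reported income falls below the
    annual average. The hidden (and false) assumption: income is
    earned evenly across all 26 fortnights of the year."""
    average = annual_income / FORTNIGHTS_PER_YEAR
    return [reported < average for reported in reported_fortnightly]

# An honest claimant: $2,000 per fortnight from January to June,
# then unemployed and truthfully reporting $0 while on benefits.
reports = [2000] * 13 + [0] * 13
flags = flag_discrepancies(annual_income=26_000, reported_fortnightly=reports)
print(sum(flags), "fortnights flagged as under-reported")  # -> 13
```

All thirteen truthful zero-income reports are flagged, because averaging smears six months of wages across the whole year. Multiply this by hundreds of thousands of claimants with irregular income and the scale of the harm becomes clear.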

Similar schemes have been used in the UK. According to Poverty Panopticon, a 2021 report by the civil liberties group Big Brother Watch [2], UK authorities use fraud detection algorithms without properly addressing the dangers of relying on data that is highly correlated with protected characteristics (e.g. sex, age and ethnicity). The DWP has used a machine learning algorithm to screen benefits claims since 2022. Until January 2024, if the algorithm flagged an applicant as a potential fraudster, their benefit payments were automatically suspended; this practice was stopped after “feedback from claimants and elected officials”. Despite the level of responsibility given to these automated processes, the department refuses to disclose the technical details of its algorithm, arguing that disclosure would undermine its ability to tackle fraudsters [3].
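
The danger Big Brother Watch highlights is easy to reproduce. In the toy model below, which uses invented data and features rather than the DWP’s undisclosed system, a classifier is trained without ever seeing a protected characteristic, yet it still scores one group as riskier, because an innocuous-looking feature is correlated with that group.

```python
# A toy illustration of proxy discrimination: the model never sees the
# protected characteristic, but a correlated feature smuggles it in.
# All data here is synthetic; this is not the DWP's model.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected characteristic (e.g. ethnicity), never shown to the model.
protected = rng.integers(0, 2, n)

# A legitimate-looking feature (say, a postcode-derived risk score)
# that happens to be strongly correlated with the protected group.
postcode_risk = protected * 0.8 + rng.normal(0, 0.3, n)

# Historical fraud labels, themselves skewed by past over-investigation
# of one group.
fraud = (rng.random(n) < 0.05 + 0.10 * protected).astype(int)

model = LogisticRegression().fit(postcode_risk.reshape(-1, 1), fraud)
scores = model.predict_proba(postcode_risk.reshape(-1, 1))[:, 1]

# The model was never given 'protected', yet its risk scores differ
# systematically between the two groups.
print("mean risk, group 0:", scores[protected == 0].mean())
print("mean risk, group 1:", scores[protected == 1].mean())
```

Dropping the protected column is therefore no guarantee of fairness: correlated features let the bias back in through a side door.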

This puts us in an ethically dangerous position. We are being asked to entrust new technologies with the needs of the most vulnerable. We are being asked to attribute agency and competence to algorithms, but without any meaningful transparency. If even one genuinely needy person is wrongly written off as a fraudster, then there is still room for improvement in our love for our neighbour. The Good Samaritan’s parable is uncompromising on that.

“Go and do likewise” (Luke 10:37). We can begin to imitate the Good Samaritan by noticing and by giving the gift of our attention. All three passers-by see the beaten victim, but only the Samaritan “takes pity on him” (Luke 10:33). We must see beyond the generalities and abstractions of data, risk profiles and demographics to the specific individual who is loved by God. This could mean prayerfully making an effort to know our cities, towns and communities better.

It could also mean learning more about how the algorithms work. The more we do this, the better we will understand them as tools that can be used for good. We can do this by reading books or watching videos, but this can be dry. As with art or sport, those of us who program generally prefer doing it to reading about it. If you’ve never done anything like this before, you could try Kaggle, an online community built around AI/ML tutorials and competitive challenges: https://www.kaggle.com/.
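
For a flavour of what a first exercise looks like, here is a minimal sketch using one of scikit-learn’s bundled toy datasets (nothing to do with any benefits system): train a simple classifier, then measure how often it is wrong on data it has not seen.

```python
# A first taste of the kind of exercise a Kaggle tutorial walks through:
# fit a simple classifier and check its accuracy on held-out data.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(X_train, y_train)
print("accuracy:", model.score(X_test, y_test))
```

Even a respectable-looking accuracy still leaves some people misclassified: exactly the gap between aggregate performance and the individual neighbour.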

Crucially, these questions cannot be answered alone, and must not be left to the narrow expertise of algorithm developers. There is huge potential for collaboration here – but it will require stepping out of comfort zones and thoughtfully testing both Christian and non-Christian ideas. One good way to start would be to set up discussion groups that seek better Christian perspectives on these issues. We have a real opportunity to speak truth into technologies that are developing rapidly and will have enormous social consequences. If this is done well, we may even find new ways to better love our neighbour.

Mark Gilbert is a research mathematician in the energy industry, and has a doctorate in mathematical biology. He is particularly interested in how the Christian Gospel speaks to questions of ethics and technology.

_____________________________________

[1] Dennis Trewin, Nicholas Fisher and Noel Cressie, The Robodebt tragedy, Significance, Volume 20, Issue 6, December 2023, pp. 18–21 – https://doi.org/10.1093/jrssig/qmad092

[2] Big Brother Watch, Poverty Panopticon, 2021 – https://bigbrotherwatch.org.uk/wp-content/uploads/2021/07/Poverty-Panopticon.pdf

[3] BBC News, Universal Credit claims no longer paused while AI fraud checks carried out, 2024 – https://www.bbc.co.uk/news/uk-politics-68030762.amp