
Defending Privacy: How to Love Your Neighbour in the Digital Age

The Online Safety Bill is supposed to be the UK’s landmark legislation for making the internet safer for citizens. It’s an ambitious but difficult project, as tricky as it is necessary. The recent controversies around the proposals to remove the reference to so-called “legal but harmful” content and, more recently, to make executives of big tech companies liable for breaching their duty of care to children, bear this out in spades. Some celebrate a tougher approach that would protect the vulnerable and detoxify the internet. Others fear arbitrary censorship, the erosion of free speech, and all the negative repercussions that come with it.

The legislation follows and seeks to address increasingly well-known problems with digital technologies and platforms powered by algorithms and big data. Indeed, it’s becoming old news by now: we’re constantly tracked and profiled by powerful AIs working on behalf of private companies (though profiling and algorithmic decision-making are also on the increase in the public sector). Our data – about our internet habits, purchase histories, romantic preferences, job performance – every trace of our lives as we interact with digital technology, is hoovered up and analyzed to generate profiles and predictions about us. The apps that we use have become apps that use us, without regard to the vulnerability of… well, all of us, but particularly the young.

Many concerns have been raised about this system in recent years: declining teen mental health caused by “performative social media” (e.g. Instagram and TikTok), the fraying of our democracies through filter bubbles and conspiracy theories going viral, injustices against minority groups through the use of data-driven policing and facial recognition tech, and so much more.

Invasion of privacy almost always comes up. Indeed, privacy has come to act as a point of convergence between tech critics, regulators, and other stakeholders within civil society concerned with the data economy. But what is privacy anyway, and why does it matter (‘if I have nothing to hide’)?

Most often, privacy is treated as an individual’s right to control what information about them is collected and analyzed, by whom, and on what terms. And there is much to be said about this view when so much control has been lost to data-hungry corporations. But privacy – in this narrow, individualist, rights-focused sense – is not enough to address the wider moral risks of emerging tech.

Privacy understood as our right and ability to control our data wrongly assumes that all we reveal about ourselves is the result of deliberate or conscious acts. This is often not the case. Our data disclosures can be manipulated through things like web and app design, unwieldy terms and conditions, and default settings that encourage maximum data collection. Deeper still, an emphasis on control places impossible demands on us. It unwittingly overwhelms our agency just as it tries to uphold it. If data is used against us, or even merely to profit from us, it then seems like it’s somehow our fault, or at least a process to which we have consented.

It’s also not merely that we’ve lost control over our data. In fact, much of the information derived, and the predictions made, about us are based on seemingly anodyne metadata (data about data) that we wouldn’t even recognize as ours in any meaningful sense. Rather, the key issue is that the whole system treats us as less than human. We are seen as digital livestock: farmed for our money, milked for our attention, our relationships and connections harvested for profit or for government power.

If we are to resist such dehumanizing uses of technology, privacy must be reimagined and placed on a truer, more robust anthropological footing. To cut to the chase, privacy is about our dignity as being precisely the sort of creatures that we are: embodied, with limitations and susceptibilities to be honored rather than violated for gain; relational, made for relationships of trust and mutual care rather than exploitation; agential, with a capacity for intentional action to be upheld rather than undermined.

This wider anthropological foundation allows us to see more clearly that privacy does not simply have to do with what others (e.g. advertisers or government agencies) know about me – an individual concern – but also with the systems that my data feeds into, the consequences of which apply not simply to me but to more vulnerable others. Rooted in collective dignity rather than individual rights, privacy remains an important way to think about what is at stake and begin to address the challenges facing us as individuals in the digital age, but most importantly our vulnerable digital neighbours.

So what is to be done? Responses must be both individual and collective, regulatory and entrepreneurial. As individuals, we should reject default settings on websites, apps, and devices that seek maximal data collection, and opt for privacy-conscious technologies and services, including privacy-friendly search engines (e.g. DuckDuckGo, Brave) and email providers (e.g. ProtonMail). Regulatory efforts to curb the extractive business models of today’s technology giants, at both national and supra-national levels, should be encouraged. But regulation is often outrun by technology. What we need are entrepreneurial responses to the status quo: alternative business models and innovation that place human flourishing and dignity at their centre. To this end, theological anthropology should be considered a rich source of wisdom to draw on, especially if what is sought is a rounded rather than reductionist, realistic rather than idealized, understanding of the human person.

Privacy is not dead, nor should we allow it to die. Protecting privacy is how we love our neighbour in the digital age.

Dr Nathan Mladin

For a detailed discussion of this topic see Data and Dignity: Why Privacy Matters in the Digital Age, the new research from Theos think tank.

A version of this article first appeared on Theos think tank. Used with permission.


Nathan is Senior Researcher at Theos think tank, in London. He is the author of several publications, including Data and Dignity: Why Privacy Matters in the Digital Age (Theos, 2023), and has previously written on surveillance and privacy in The Robot Will See You Now: Artificial Intelligence and the Christian Faith, edited by John Wyatt and Stephen Williams (SPCK, 2021). Nathan holds a Ph.D. in Systematic Theology from Queen’s University Belfast.
