
What Wendell Berry and COVID-19 Tell Us about AI

Over the last few months, BP (Before Pandemic), I’ve been asking myself “how do we Wendell Berry AI?”

To put Wendell Berry’s name next to AI would seem as much an oxymoron as “AI and Faith” does to many, because Berry is famously a minimalist when it comes to technology.  He is a sixth generation (on both sides) Kentucky farmer who does not own a computer or a cell phone. Since 1965, he has worked his farm, Lane’s Landing, with as little machinery as possible while writing almost fifty novels and books of poetry and essays.  At 84, Berry is a recognized national treasure, having been awarded the National Humanities Medal and inducted into the American Academy of Arts and Sciences.  He is well on his way from name/noun to verb, hence my question.  Wikipedia describes his occupation as “poet, farmer, writer, activist, academic” and summarizes his writing as “grounded in the notion that one’s work ought to be rooted in and responsive to one’s place.”

The experience of living this pandemic in the past month has suggested one answer to my question.  Like Berry’s writing, AI needs to be grounded in the reality of the particular, of actual living in a particular place.

Over the past month, since I casually wrote in our March Newsletter of the interesting ways that AI could help respond to COVID-19, our way of life has suffered a series of shocks we could not have imagined on March 1, and those shocks do not even count the far harder blow to the people who have been gravely sickened or have died, and to their loved ones.  An abstract idea and a news story happening oceans away – a pandemic – has mushroomed here in Seattle and many other parts of the country into a surreal reality of closures, confinement, and economic loss on a scale we’ve never before experienced.  We see how the fabric of our society, culture and lives can be entirely upended even as spring comes on in its natural beauty: flowers emerge from their bulbs, the cherry blossoms work their annual charm, and we are grateful for tree pollen and itchy eyes because they mean our snuffling noses are not a symptom of COVID-19.

There’s my first lesson for our engagement with AI in the reality of this world:  natural life and processes go on, even as we search for control by grasping at data.

Throughout this huge change, we have been constantly asking what more is to come, and seeking answers in the details of data as translated by computer models. This past week in Seattle, the epidemiological modelling of the Institute for Health Metrics and Evaluation at the University of Washington suggested a local answer – our Washington State spike in cases may occur by mid-April.  We are already grasping at signs that locally our “curve may be flattening” – a metric not in our vocabulary on March 1.  Nationally, our president’s wishful desire to exercise control translated initially into a proposal to “reopen the economy” at the expense of the only universal remedy in our current arsenal, social distancing.  I’m little better – after reading the IHME prediction, my immediate impulse was to start building my calendar around this model, mentally discounting its many caveats in order to take back an illusion of control over my life.

There’s the second lesson of the pandemic for our understanding of AI:  Data offers a sense of control that we grasp for, regardless of its quality.

The problem with all our models in the US, compared to those of countries like China, South Korea and Singapore that had experienced serious SARS-type viral epidemics, is that we lacked the foresight and discipline to quickly ramp up widescale testing.  Without that ability, we in the US have been reduced to measuring the progress of the disease by those who show up at the hospital – the equivalent of  the drunk who searches for his car keys under the streetlight because that’s the only place he can see.  And so on the front end of this pandemic, as cases grow, our understanding of what is happening on the ground has been much more limited than  in countries that successfully “flattened their curve” relatively quickly, even as our willingness to subject ourselves to the bitter medicine of “shelter in place” also has been limited by our strong predilection for individual liberty.

Lacking the necessary detailed and broad data on the health of the entire populace, our national political debate has turned to an abstract calculus driven by the broad-scale economic pain of the pandemic:  how do we trade off economic loss, and the harm to human health that flows from it, against the specific numbers of COVID-19 cases turning up at the hospitals?  Models of the impact of income inequality and poverty on human health have suddenly become relevant to public policy around the pandemic as a means of asking (some say cynically) “how do we accomplish the greater good?”

Meanwhile at hospitals in Italy struggling against the weight of pandemic numbers, doctors began a similar weighing of the quality of life in the face of a shortage of respirators.  Here in the US, where such shortages had been – until this week – more feared than actually experienced, this triage-based conversation about whom to save with scarce resources merged with the broader economically-driven discussion of tradeoffs to produce suggestions for mandatory “do not resuscitate” orders.  That in turn prompted an appropriate backlash, including from faith leaders, who rightly asked:  how did we move so quickly, and in so abstract a fashion, from a “can do” attitude of seeking to save all the lives we could to a “triage in the abstract” willingness to write off the vulnerable and the aged in the interest of “the greater good”?

Wendell Berry has an answer for this flight to the abstract at the cost of the particular.  In 2010 he published a prose poem called “Questionnaire” in an issue of Yale Divinity School’s ethics magazine Reflections that was dedicated to Money and Morals After the Crash. Each entry of the “Questionnaire” asks what, in particular, you really mean by a policy position.  Here is the fifth entry:

5. State briefly the ideas, ideals, or hopes,
the energy sources, the kinds of security,
for which you would kill a child.
Name, please, the children whom
you would be willing to kill.

There is the third lesson of this pandemic for AI:  It’s far simpler to make decisions at an abstract level than it is to deal with or wait for the details that may be revealed in personalized data.

This is reminiscent of our interest in substituting AI-powered decision-making systems for human judgment on such vital life-and-liberty issues as the length of criminal sentences, access to parole or to social services, and convictions based on facial surveillance data, as SUNY Professor Virginia Eubanks describes so compellingly in her 2018 book Automating Inequality. We want to view “data-driven,” algorithmically justified decisions as more valid when, in fact, they may very well be simply encapsulating systemic bias under a patina of neutrality.

As I came to understand why testing is so important with this virus, which spreads freely while its carriers are asymptomatic, I also marveled at how public health officials have been able in the past to successfully control infections by persistent and ambitious identification of contacts, tracing through them the infectious path of a virus between specific human carriers.  That is daunting and difficult detective work, and yet it is the public health norm and expectation for what should happen when a new virus shows up.  An epidemiologist friend explained this to me, and together we worked up an article that Christianity Today published on its website as guidance for faith congregations two weeks ago. It is only when the detective work of tracing viruses between individual carriers breaks down that you go from the green light of societal business as usual through the caution yellow light into the red light of “community transmission” – another term now fresh in common parlance.

In other words, the critical public health work to cut short an epidemic is deeply local.  Stopping an epidemic before it really takes off cannot succeed without a highly granular approach to the transmission of disease.  But a pandemic is by definition global.  It calls for a national policy to guide and support the local response.  And that’s where our current distrust of scientific expertise and government has especially failed us in this crisis.  Poor planning, a decentralized public health system, and emotional polarization have forfeited our opportunity to defeat this especially stealthy and communicable virus.

There is the fourth lesson that this pandemic teaches us about AI:  For all of our talk of Big Data and globalization, AI is primarily a search for the deeply particular.

Big Data promises the ability to answer questions at a cosmological scale, but what predictive analytics and pattern recognition are really doing is looking for needles of insight in an ever expanding data haystack.  It’s the ability to granularly sift for insights never previously available from data on such a grand scale that is the alluring promise of AI.

In a similar fashion we are learning that overcoming the global reach of a viral pandemic requires intensive individual, particularized effort and decisionmaking.  In a western liberal democracy like ours that has armed itself against the spread of this virus only with social distancing, it is up to millions of people to decide each day how much they will honor their government’s call to stay at home, to engage only in truly essential economic and social interactions, to forego such fundamental rights as speech, association, and corporate expression of religious beliefs. It is something of a triumph that so many of us have done so, given that this deprivation is based on epidemiological modelling and hard science of the same sort so many routinely reject around the even bigger but slower moving catastrophe of climate change.

In the much more authoritarian setting of China, the government set neighbor against neighbor by encouraging the reporting of quarantine violators and deploying the full power of its surveillance tools to enforce isolation.  But now we are learning how more benevolent, hyper-localized networks of individuals, organized at arm’s length to meet each other’s needs, positively enabled Wuhan’s radical isolation and relatively rapid flattening of its curve.

Having missed the opportunity to avoid community transmission on the front end, it may be that the power of particularized data can still help America emerge from social isolation more rapidly and safer from a relapse.  The development of serological (antibody) tests that identify who has actually had the virus, and thereby gained some measure of immunity, could create a work force free to commingle in more normal manufacturing and social engagement.  Broadly deployed biometric screening, like en masse temperature measurement coupled with identification technology, could allow the individual profiling and deployment of non-vulnerable workers even if they have not had the virus.  Intensive testing, along with more particularized quarantining and tracing enabled and enforced by online personal data, location services data and surveillance software when the virus does appear, also could contribute to a safer and faster return to the workforce for many.

And as usual with AI, there is a considerable downside risk to the deployment of such tools. Not only is there strong potential for erosion of the personal privacy and integrity of our personal health data embodied in HIPAA and other statutes.  We could also become a two-class society for a considerable period of time – for one large and favored group a “reopened economy,” and for another – the immune-compromised, the older, those who have not had a documented exposure to the virus – an indefinite disqualification for economic and social “reentry” in the interest of maximizing economic recovery.  Such erosion may extend to the tools for effective enforcement when the contrast in opportunity between the re-entered and the excluded becomes stark.  These risks to privacy and equal opportunity, if they manifest, will arise as much from the deeply granular differentiation powered by AI as from the potential to avoid structural financial collapse.

Which takes us once again back to how our societal values are deeply stressed in this time of emergency and how we can avoid collapsing into strictly utilitarian decisionmaking in the name of the “greater good.”

That’s my last suggestion for what this pandemic teaches us about AI:  A major societal force like AI needs to be grounded in deep and long held beliefs and understanding of the unique value of human life.

So back to Wendell Berry and the people of his Port Royal novels, who give birth, live, work, love, worship and die in community.  We can only manage well in a time like this when we return to first principles about our equal worth as humans in relation to each other as neighbors, to moral systems that rise above the urgent demands of a moment, to appreciation for the self-sacrifice of front line health workers, restaurant servers, teachers, everyone who is foregoing the needs of life to bring about the greatest good of all – care for others by action and self-deprivation.  For many of us, these principles, and the ability to act on them in the pressures of the moment, are grounded and empowered in belief in a sovereign, all powerful, all loving Deity and recognition of the cosmological breadth of life.  Here again is that dance of the particular within the infinite, powered not by AI but by someone far greater and kinder.
