
Consulting an Oracle: How Generative AI Reframes Idolatry

The Judeo-Christian tradition has an acute sensitivity to idolatry. That is by design. As one reads the Hebrew Scriptures, no sin is more pervasive than the worship of false gods. From the wandering in the desert to the diaspora, the nation of Israel often fell prey to this seductive counterfeit of true worship. Prophets and priests repeatedly warned of its dangers, sparing no words or grand gestures to drive their stern warnings home.

Fast-forward many centuries, and this fixation on idolatry as a grave sin became embedded in the European Christian tradition, a hallmark paradigm in its encounters with other cultures. This partially explains why, as Christianity overtook the Western world, the region's pagan roots were all but eradicated.

With the advent of the Enlightenment and the scientific revolution, the enemy of the Christian faith began wearing different clothes. Idolatry was no longer located in carved statues but in conceptual artifacts such as human progress, technology, and government. Because of that, Christians of every denominational background carry a healthy dose of skepticism toward the idol du jour.

AI moves into the creative sphere

After this brief historical excursus, we now move to AI. The growing adoption of AI algorithms in industry and government over the last decade has filled our collective awareness with awe and fear. On the one hand, the fast development of these technologies, surprising even experts in the field, opens new worlds of possibilities that leave us dumbfounded. On the other hand, the potential for evil is also terrifying as we speculate on the destruction such tools could cause should they fall into the wrong hands.

This became even more evident with the popularization of generative AI applications such as OpenAI’s ChatGPT. This openly accessible interface took the world by storm in November 2022 and has only amplified the awe and fear described above. On the fear side, it shifted the conversation from a concern about automation to a panic over enhanced algorithmic performance on creative tasks. If our worst-case scenarios described an AI overlord that automated repetitive tasks, we now see these AI applications operating within the creative space in ways that, in some respects, exceed human capabilities. AI went from an obedient slave to an intelligent partner who can bring novel approaches to the mix.

While fear of automation persists, it is no longer the center of the debate. We must now contend with creative AI. This has also enlivened the conversation around artificial general intelligence (AGI). Could creative thinking mean that AI is demonstrating emerging glimpses of self-awareness? Can it think for itself? Even so, in my humble opinion, dwelling on these questions misses the point of the challenges we face.

Redefining Idolatry

Before addressing the question of new forms of AI worship, redefining idolatry is in order. After all, what does it mean to venerate idols in our technological age? In a time when fewer and fewer of us pay obeisance to material objects in a visible manner, idolatry as a concept needs an upgrade.

In plain English, idolatry simply means to attribute ultimate value where it does not belong. Another way to define it is to give undue affection to ordinary and temporal things. I like this definition because it opens avenues of understanding that transcend outward behaviors of honor. It is more about what we put our trust in and less about who or what we bow to.

In that light, AI, like other technologies before it, presents an alluring temptation to the morally inattentive individual. First, it is tempting to believe that AI can solve all our problems. While some companies may run their business models on these assumptions, it is fair to say such a belief is not prevalent. Apart from a minority of secular transhumanists preaching the advent of the singularity, few of us have such blind faith in AI. In a recent Pew Research report, 85% of Americans were either concerned or ambivalent about the prospect of AI’s impact on society.

AI Idolatry

Even so, the temptation for idolatry may be even more subtle. The power of large language models stems from their ability to further deepen the personalization of machines. That is, they make our interaction with them feel more and more like a relationship with another person. Instead of getting a list of possible links from a Google search, we can simply ask a question and get an answer that reads as a human-like response. Furthermore, an answer eloquently tuned by massive algorithms can make even the most absurd idea sound true.

Combined with the vice of acedia (laziness), this idolatry can foster an unhealthy dependence on devices with far-reaching consequences. One noteworthy consequence is an overreliance on generative AI tools that think for us. As we take answers at face value, we outsource critical thinking to an unreliable model that is prone to confabulations.1

It could also lead to the replacement of real human relationships with ones facilitated by commoditized agents optimized to extract profit and information from unsuspecting customers. As generative AI powers companion chatbots, many may start replacing friendships or even romantic relationships with a large language model. Certainly this already happens through cheap substitutes like pornography, excessive gaming, and social media. However, with generative AI, the allure and sense of relationship become much stronger.

Finally, as AI advances inch closer to a semblance of artificial general intelligence, it is not hard to envision full religions based on AI, taking the form of AI idols set up for worship, counsel, and guidance. While some may find this far-fetched, such a movement already exists even with the limited technology we have now. One can only imagine the possibilities for expansion with the rapid development of AI, and shudder.

False Security

By now, we religious ones may pat ourselves on the back and high-five each other for seeing through this deception. We can sit comfortably in our ideological bubbles bemoaning a lost world falling into yet another trap while we hold fast to timeless truths. If I ended the article here, that may be a fair conclusion to draw on this topic.

Nevertheless, the advent of emerging technologies like AI also exposes another type of idolatry, one that religious people have fallen victim to for thousands of years: the idolatry of certainty and pre-established worldviews. In a world of dizzying change, it is a natural reaction, like the faithful Pharisees, to cling to the familiar. Doing so may offer momentary comfort while also feeding our self-righteous pride. Yet that itself is a form of idolatry.

For example, religious leaders are often keen to defend the uniqueness and superiority of humanity over any machine’s emulation of human ability. Some even want to ensure we don’t mistake AI’s creative powers for divine creation. The issue is not one of accuracy, but of the underlying perceived threat to fixed ideas about God and humanity. The rigid clinging to these perspectives, instead of seeing new technologies as invitations to reconsider, is what I call the idolatry of tradition. In other words, it becomes more important to uphold and defend a belief than to face new evidence with an open mind.

Knee-jerk reactions will only distract us from the very important work of stewarding these technologies into upholding human flourishing. Therefore, at the dawn of a new global challenge that can transform human societies, let us engage with humble hearts and open minds. What is there for us to learn? What parts of our story must we let go to open space for the new? What dangers lie ahead but also what possibilities will these new developments bring?

Will we abdicate our responsibility to build a just future, or will we rise to the challenge with the courage of the prophets and saints of the past?

  1. While “hallucinations” has entered the popular lexicon for this phenomenon, some AI experts are pushing to rename it “confabulations”, for two reasons. First, “hallucination” may stigmatize those with mental health conditions such as schizophrenia. Second, “hallucination” can imply a consciousness that is prone to wander, which risks anthropomorphizing the model.

Elias Kruger

Elias Kruger is a Quantitative Analytics Manager and VP at Wells Fargo Bank in Atlanta, and the founder of AI and Theology, which seeks to apply a thoughtful Christian lens to the promise and peril of artificial intelligence. Elias holds a Master of Theology from Fuller Seminary and an MBA from Regent University.


