I had never been to South Bend before, and one of the first places to which I was directed when I arrived was the outdoor prayer grotto just behind the main chapel: a place of prayer, devotion, and meditation. When I finally made it there on a run in the morning hours before sunrise, it was illuminated by dozens of candles, signs of the many prayers offered there. It seemed an appropriate symbol: the conference similarly cast rays of illumination on the integration of artificial intelligence and Christian faith, just as I was about to begin my role as executive director of AI & Faith.
Indeed, at the Notre Dame Summit on AI, Faith, and Human Flourishing, the air was electric and urgent. The speakers voiced an unmistakable concern that Christians had missed their chance to influence social media, and that we had better not miss that opportunity with AI. It was a true opportunity, a kairos moment. “Every human being possesses inherent worth because made in the image of God,” declared philosopher Meghan Sullivan (at least according to my notes) as she unveiled the DELTA Framework: Dignity, Embodiment, Love, Transcendence, and Agency. Since the DELTA Framework was formulated by Sullivan, it was no surprise that she kicked off the program. (Technically, there was Mass at the University’s Basilica of the Sacred Heart and a dinner the night before, overlooking what some might see as the other cathedral at Notre Dame, the football stadium.) DELTA embodies a Thomistic-Aristotelian virtue ethics that affirms human worth, underlines our embodied and relational nature, anchors these ethics in love (which, as a side note, I believe organizes the other four values), orients innovation toward transcendent truth, and promotes human conscience and moral freedom.
The Notre Dame conference raised a cry. There is an urgent need, almost every speaker emphasized, for people of faith to engage in shaping the moral conversation around artificial intelligence. AI’s accelerating power must be matched by a moral and spiritual response. The key challenge was that, while Christians played a strong role in the ethical debates of the 20th century, they have been largely absent from 21st-century tech ethics. And while Christians did not influence social media adequately (“we missed the boat on this one” was a common sentiment), we cannot similarly miss the rollout of AI. Speakers urged renewed participation grounded in the DELTA Framework, which honors inherent human worth, relational embodiment, and moral responsibility. My notes recorded this from Sullivan: “We cannot slow down this technology, but we can only speed up and deepen the conversation.”
No one at the conference (that I heard, at least) questioned the power of AI to reshape key institutions such as universities, businesses, community groups, and churches; the question before the church is to what degree it will shape this influence. AI certainly promotes knowledge (in a certain fashion), but it lacks wisdom. Around this time, I was listening to Cornel West, who declared (again, from my notes), “I tell my students all the time—‘don’t try to be the smartest person in the room; seek wisdom.’” LLMs may predict the next word with astonishing speed and fluency, but they are not wise enough to discern what is true or good.
In the interest of space, let me highlight two other presentations. In his talk “The Role of AI in Our Faith, Families, and Worldview,” Andy Crouch argued that “there is no instant, effortless love,” urging restraint and rhythms of silence to preserve human depth amid technological enchantment. “GenAI and Youth,” presented by Ron Ivey and Alan Marty, warned about the effects of emerging technologies on youth: automation and generative AI risk hollowing out meaning, community, and character formation. This concern isn’t lost on young people themselves. A recent survey by Gallup and the Walton Family Foundation found that 49% of Gen Zers surveyed are worried AI will corrode their ability to think critically, and 41% say generative AI tools make them anxious. The effect of AI on children and youth is not all good news, and yet these disruptions also invite renewed reflection on human dignity and ethical response.
As far as I could discern, something of a shared conviction emerged: technology must serve human and spiritual flourishing, not define it. I often rephrase Jesus’s words in Mark 2:27 as follows: “Tech was made for human beings, not human beings for tech.” Christian leaders, scholars, and technologists must enter the public dialogue with humility, hope, and wisdom, seeking to steward the development of AI in ways that honor moral responsibility and the sacred dignity of human life.
There is much more to say, of course, but I’ll close here. I was happy to be there (I was a late registrant) and to see how we at AI & Faith can pick up the gauntlet that this conference, and the phenomenon of AI today, has laid down. It’s no small challenge, but one that I’m thankful our AI & Faith community is responding to.
Views and opinions expressed by authors and editors are their own and do not necessarily reflect the view of AI and Faith or any of its leadership.


