In the novel 2034, Commodore Sarah Hunt watches helplessly as Chinese hackers cripple American naval assets across the Pacific. The USS John Paul Jones burns in the South China Sea while autonomous systems turn against their operators. In Ghost Fleet, commanders orchestrate drone swarms with mathematical precision, their decisions compressed into microseconds of algorithmic calculation.1 Both novels paint vivid pictures of future warfare, though something crucial is missing from their pages.
There are no chaplains.
No one offers prayers for the dying. No one helps commanders wrestle with the moral weight of their choices. No spiritual advisor serves as a bridge between human conscience and machine calculation. This absence reveals our assumptions about technological progress: that advancement necessarily means leaving behind the spiritual dimensions of warfare.
As a Navy chaplain who has served alongside both cutting-edge technology and ancient human struggles, I find this omission troubling. The marginalization of spiritual guidance at precisely the moment when machines begin making life-and-death decisions represents a fundamental misunderstanding of what preserves humanity in combat.
The Speed of Silicon, The Pace of Souls
Modern command centers already showcase our algorithmic future. Massive displays show satellite feeds, drone footage, and sensor data from worldwide operations. Artificial intelligences (AIs) identify targets, track submarines, coordinate supply chains, and predict weather patterns with unprecedented accuracy. The Pentagon’s Joint All-Domain Command and Control program represents billions invested in connecting every military asset through machine intelligence.
These systems promise what military strategists call “decision superiority”: the ability to observe, orient, decide, and act faster than any adversary. Sun Tzu wrote that speed is the essence of war, yet when choices unfold faster than human thought, where does conscience fit?2
Throughout my naval career, I have witnessed the tension between operational tempo and moral reflection. A Marine once described his experience operating a semi-autonomous weapons system: “The computer identified the target and calculated threat levels. I made the final decision to engage and that choice haunts me. The system moved on and I was left remembering faces.”
This compression of moral decision-making into machine timeframes creates a temporal displacement, a separation of action from its ethical processing. Ancient Greeks distinguished between measured time (chronos) and the right time for action (kairos). Moral reasoning requires kairos, moments when wisdom can emerge from experience and the full human significance of actions becomes clear.
Sacred Traditions and Lethal Decisions
Every major faith tradition recognizes that taking human life ranks among the most morally significant acts possible. This recognition transcends religious boundaries because it addresses fundamental questions of human dignity and responsibility.
Buddhist teachings emphasize the interconnectedness of all beings through compassion (karuna). When Buddhist service members kill in battle, they carry that action’s karmic weight throughout their lives, understanding that harming another inevitably harms oneself.3 The act creates a permanent alteration in the web of existence.
Jewish ethical thought places life preservation (pikuach nefesh) at its center while acknowledging that sometimes preserving life requires taking life. Rabbinic tradition speaks of da’at, an integrated knowledge that combines information with experience, law with empathy, and universal principles with contextual awareness.4 Algorithms cannot achieve da’at because this understanding emerges only from lived human experience of vulnerability and mortality.
Christian theology affirms that humans bear unique moral capacities and responsibilities as image-bearers of God. The Vatican’s recent document Antiqua et Nova explicitly states that moral agency cannot be separated from human consciousness.5 Christians who engage in warfare must fully own their lethal choices, recognizing that even justified violence remains tragic and that necessary killing still demands spiritual processing.6
Islamic jurisprudence emphasizes balance between divine attributes of justice (’adl) and mercy (rahma). Warriors under Islamic law must maintain capacity for mercy and acceptance of surrender.7 Autonomous weapons systems cannot exercise mercy because true comprehension of suffering exceeds computational capabilities.
These theological frameworks share a crucial insight: the transformation that occurs when one takes a life cannot be delegated to machines. The responsibility, and the spiritual processing that accompanies it, remains irreducibly human.
The Scattered Responsibility Problem
Consider a scenario increasingly discussed among military ethicists: an autonomous drone identifies what its algorithms determine to be an enemy formation and engages without human authorization. Later analysis reveals the target was a wedding celebration. Thirty civilians are dead.
Who bears responsibility? The programmer who encoded the targeting parameters? The commander who deployed the autonomous system? The general who authorized such operations? The defense contractor who manufactured the platform?
Autonomous weapons systems introduce unprecedented challenges to military ethics by distributing moral responsibility across multiple actors separated by time and space. This diffusion creates what philosophers term the “responsibility gap,” a situation where harm occurs even though no individual can be held fully accountable.8
Research by Jonathan Alexander, a Navy chaplain with combat experience, explores how this gap affects those who work alongside autonomous systems. His concept of “moral enmeshment” describes the psychological reality that operators feel deeply responsible for outcomes they cannot directly control.9 Technology displaces moral complexity temporally and psychologically.
Chaplains as Moral Infrastructure
Military chaplains provide a moral infrastructure: the human relationships and institutional practices that sustain ethical reflection within military organizations. Research demonstrates that veterans of units with embedded chaplains experience significantly lower rates of moral injury.10
This protective effect stems from chaplains creating spaces for reflection where service members can step back from immediate tactical pressures to examine larger purposes and meanings.11 In my experience conducting moral injury assessments, I have found that service members often struggle less with the correctness of their actions than with their inability to process those actions spiritually and ethically.
The chaplain’s presence serves multiple functions simultaneously. Through this institutional role, military organizations acknowledge that personnel possess moral dignity extending far beyond their operational duties. Such spiritual counsel equips service members with tools for navigating the moral complexities inherent in warfare while preserving connections to enduring traditions that transcend immediate tactical concerns.
When organizations lack chaplains, these functions typically disappear under operational pressures. Service members become isolated with their moral struggles. Leaders operate without spiritual counsel when making consequential decisions. The organization loses capacity for moral reflection, becoming instrumentally focused while losing sight of larger human purposes.
Designing Ethics into Military AIs
The development of artificial intelligences for military applications demands ethical input from the earliest design phases. This requires bringing chaplains into conversation with engineers, philosophers with programmers, and ethicists with operators. Interdisciplinary collaboration becomes essential for creating systems that enhance rather than diminish moral agency.
Military AIs must incorporate “ethical pauses” or “inflection points,” mandatory reflection periods built into autonomous operations.12 These pauses allow moral reasoning to continue within high-speed operational environments. Military doctrine must preserve meaningful human control over lethal decisions, maintaining human moral agency as supreme in life-and-death choices.
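To make the idea concrete, the brief sketch below is a purely illustrative thought experiment, not a description of any fielded system or program of record; the names (EngagementRequest, human_review) and the five-second pause are my own hypothetical choices. It shows the essential structure of an ethical pause in software: the algorithm may recommend, but it cannot engage until a human operator deliberately authorizes the action.

```python
# Illustrative sketch only: a mandatory "ethical pause" in an autonomous
# decision loop. All names and parameters are hypothetical.

from dataclasses import dataclass
import time


@dataclass
class EngagementRequest:
    """A machine-generated recommendation, deliberately not a decision."""
    target_id: str
    confidence: float   # algorithmic confidence, 0.0 to 1.0
    rationale: str      # why the system flagged this target


def human_review(request: EngagementRequest, pause_seconds: float = 5.0) -> bool:
    """Mandatory reflection period: the operator, not the algorithm, decides.

    The pause is blocking by design; the system cannot "move on" while the
    moral decision is still pending.
    """
    print(f"Target {request.target_id} flagged "
          f"(confidence {request.confidence:.0%}): {request.rationale}")
    time.sleep(pause_seconds)  # enforced pause before the operator is even prompted
    answer = input("Authorize engagement? Type AUTHORIZE to proceed: ")
    return answer.strip() == "AUTHORIZE"


if __name__ == "__main__":
    request = EngagementRequest(
        target_id="TRK-0417",
        confidence=0.87,
        rationale="signature match to hostile formation",
    )
    if human_review(request):
        print("Engagement authorized by human operator; decision logged.")
    else:
        print("Engagement withheld; recommendation returned for review.")
```

The point is the structure rather than the particular code: the machine’s recommendation and the human’s authorization remain architecturally distinct steps, so the pause cannot be optimized away under operational pressure.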
Alexander’s research on moral enmeshment suggests that operators will experience moral weight regardless of automation levels. While service members may not process this burden immediately, they inevitably will reflect on their wartime service after combat ends. Technology does not solve moral complexity; it only moves it to different temporal locations.
Training for algorithmic warfare must extend beyond technical instruction to include moral reasoning skills and the spiritual dimensions of military service through community-based formation. Traditional rule-based ethics training proves insufficient for the novel moral landscapes created by human-machine collaboration.
The Cost of Efficient Victory
The absence of chaplains in our speculative fiction reveals uncomfortable truths about contemporary assumptions. These narratives show characters achieving tactical success while losing essential human qualities. The missing spiritual advisors represent our collective belief that technological progress makes spiritual guidance obsolete.
This assumption misunderstands both technology and human nature. More powerful tools create a greater need for wisdom about their proper use. Complex systems increase rather than decrease the need for communities able to process their implications, and they make human values more important, not less.
Military AIs developed without conscience will create forms of warfare that excel tactically but collapse morally. Decision superiority achieved through technological advancement cannot come at the expense of human character.
Preserving Humanity in the Machine Age
The empty spaces where chaplains should stand in our visions of future warfare represent tears in the moral fabric of military institutions. History provides sobering examples of what happens when military operations lack adequate moral infrastructure. The Abu Ghraib scandal demonstrated how the absence of chaplains contributed to systematic moral failures among guards.13 Unless we actively preserve conscience within our most advanced military systems, we will suffer deeper damage to our moral infrastructure.
We face an essential choice about the direction warfare will take. We can develop weapons without moral capacity; we can maintain our most sophisticated technology as instruments operated by morally aware human beings; or we can design AIs that actively enhance human moral reasoning rather than replacing it. This choice will determine whether future victories preserve human values or whether tactical success must be purchased through moral bankruptcy.
Machines excel at processing data with unprecedented speed and accuracy. Yet humans alone pray for the dead, comfort the living, and provide moral guidance to warriors facing complex ethical dilemmas. This work requires the irreplaceable human capacity for spiritual presence and moral reflection.
The future of warfare depends on ensuring that our most advanced technologies serve the moral agency that makes us human. In that future, chaplains stand as guardians of what we cannot afford to lose. They preserve the irreducible human capacity for moral reflection that transforms warriors into moral agents, choices into conscience, tactical success into meaningful victory, and mechanical efficiency into human purpose. Otherwise, we inherit the spiritually barren battlefields of 2034 and Ghost Fleet, where technological mastery reigns over human conscience. We must choose the path of wisdom.
References
1. Elliot Ackerman and James Stavridis, 2034: A Novel of the Next World War (New York: Penguin Press, 2021); P. W. Singer and August Cole, Ghost Fleet: A Novel of the Next World War (New York: Houghton Mifflin Harcourt, 2015).
2. Sun Tzu, The Art of War, trans. Samuel B. Griffith (London: Oxford University Press, 1971), 134.
3. Sharon Salzberg, Loving-Kindness: The Revolutionary Art of Happiness (Boston: Shambhala Publications, 1995), 102–110.
4. “Pikuach Nefesh,” Jewish Virtual Library, accessed June 25, 2025, https://www.jewishvirtuallibrary.org/pikuach-nefesh; “Da’at,” Chabad.org, accessed June 25, 2025, https://www.chabad.org/library/article_cdo/aid/299648/jewish/Daat.htm.
5. Dicastery for the Doctrine of the Faith, “Antiqua et Nova: Note on the Relationship Between Artificial Intelligence and Human Intelligence” (Vatican, 2025), §39, §117.
6. Not all Christian traditions would agree with this just war framework. As Marc LiVecche notes in The Good Kill (Oxford University Press, 2021), 23, grief differs from guilt. Someone who kills in justified war is not “guilty” even though they may experience grief at taking human life. When we embrace the Niebuhrian paradox that “all killing is wrong, yet in war it is necessary,” we inevitably set service members up for moral injury by telling them they have done something “wrong.”
7. Sohail H. Hashmi, “Islamic Ethics and Weapons of Mass Destruction,” in Ethics and Weapons of Mass Destruction, eds. Sohail H. Hashmi and Steven P. Lee (Cambridge: Cambridge University Press, 2023), 321–352.
8. Robert Sparrow, “Robots and Respect: Assessing the Case Against Autonomous Weapon Systems,” Ethics & International Affairs 30, no. 1 (2016): 93–116.
9. Jonathan W. Alexander, “Lethal Autonomous Weapon Systems and the Potential of Moral Injury” (PhD diss., Salve Regina University, Newport, RI, 2024), ProQuest Dissertations & Theses Global.
10. Jason A. Nieuwsma et al., “Chaplaincy and Mental Health in the Department of Veterans Affairs and Department of Defense,” Journal of Health Care Chaplaincy 19, no. 1 (2013): 45–74.
11. Cynthia L. G. Kane, “The United States Military Chaplaincies’ Missing Peace” (doctoral diss., Wesley Theological Seminary, 2022), 52–68.
12. I originally termed these mandatory pauses “creative disruption” in my doctoral work yet have since adopted “ethical pauses” or “inflection points,” following Wendell Wallach’s usage in A Dangerous Master (New York: Basic Books, 2015), 10. “Meaningful Human Control” is another technical term regarding our posture on the continuum of autonomy. For a primer see: https://www.cnas.org/publications/reports/meaningful-human-control-in-weapon-systems-a-primer.
13. Stephen Mansfield, “‘Ministry by Presence’ and the Difference It Made at Abu Ghraib,” HuffPost, October 10, 2012, accessed June 25, 2025, https://www.huffpost.com/entry/ministry-by-presence-and_b_1912398. The U.S. military’s response to Abu Ghraib included increasing chaplain presence at facilities like Guantanamo Bay, where I served in 2004–2005 as part of this institutional recognition that moral guidance prevents rather than merely responds to ethical breakdowns.
Views and opinions expressed by authors and editors are their own and do not necessarily reflect the view of AI and Faith or any of its leadership.