Dr. Philip Butler is Assistant Professor of Theology and Black Posthuman Artificial Intelligence Systems at Iliff School of Theology in Denver, Colorado. Professor Butler is also a panelist for the program “AI and Religion in 2041” organized by AI&F for the AI Research Seminar at the American Academy of Religion’s Annual Conference in Denver this coming November. Professor Butler’s cutting-edge scholarship combines Black liberation theologies, neuroscience, spirituality and technology, particularly artificial intelligence. He is also the founder of The Seekr Project, which is a distinctly Black conversational artificial intelligence with mental health capacities, combining machine learning and psychotherapeutic systems.
Thank you to Dr. Butler and to Haley Griese, the newest member of AI&F’s Editorial Team, for this interview as a lead-in to our contribution to the AAR’s Annual Meeting.
Haley: Dr. Butler, thank you so much for joining me today. I would love to start with a bit about your work, how you landed in your current area of inquiry, and what specifically drove you into questions around technology.
Dr. Butler: Sure. So, I like to go to the movies early to see the previews. I don’t remember the movie I went to, but there was a preview before the previews where a documentary came on, and the documentary was on science fiction and its influence on the future. One of the questions posed within the documentary was: What would the world look like in 80 years? And sitting there in the movie theater, I didn’t know the answer to that, but I knew that I wanted to be a part of it. And so that kind of question sat with me. I tell my students that my goal is to ask you questions that never leave you. And that’s one of those questions — it stayed there.
When I went to embark on my doctoral work, my initial interest was in neuroscience and spirituality. But I also did coursework during the time of Trayvon Martin and Freddie Gray and Sandra Bland – their names just continue to roll on. And so, I had to rethink what my scholarship was going to look like, the trajectory of my own research and what I was doing, because I was not doing this in a vacuum. I was doing this with people at home and people that cared about me, people I cared about, the people that claimed me, so what was the influence and the intention of my work as a kind of movement forward?
[I realized] that if I did not like the reality that I was a part of, I had to be intentional about creating a new one. And that’s part of where technology came in — technology and Blackness specifically. Now, when I introduce myself, [I say] my work is neuroscience, technology, spirituality and Blackness. This is how these four intersections came to be. [Also,] this is where I initially formed my interest in transhumanism, which subsequently led me to post- and meta- humanism and even anti-humanism. But that’s a whole other conversation.
Haley: Relating to the transformation of your doctoral work, could you speak on being Black in a technologically driven culture and the opportunities or risks that you see emergent technologies generating?
Dr. Butler: Sure, I wrote about this in my dissertation and in my first book, Black Transhuman Liberation Theology. The idea is that technological innovation has been happening all around us for centuries. The perception of what technology “is” and what it “isn’t” is not only changing, but it’s also important that it is undergoing this change. I speak about this in terms of race, maybe because I think there’s this conception of technology as a telephone, especially a smartphone, or maybe a smart television and teleportation, you know — versus something like rice, or any cultivation of agricultural systems like that, or even a pacemaker. My grandmother, who passed away in 2018, wore a pacemaker. And so without knowing it, she was transhuman, because she had a life extension device that helped to keep her around for ten years. I’m very grateful for the time I got to spend with her.
Part of the goal here is not only to reinvigorate, or help add to the invigoration of, Black imagination around technology, but also to think about the ways in which technology becomes a liberative tool in the transformation of the reality that we have, so that we – Black folks – can arrive somewhere different, knowing not only that technology is a good thing, but that it’s something we cannot avoid.
Haley: You use the terms posthumanism, transhumanism, and metahumanism. Would you be able to give a brief definition of each and specifically what your interest is in those areas?
Dr. Butler: Transhumanism — I’ll take an abbreviated version of what Nick Bostrom said — I’d describe as any augmentation of intellectual, psychological and physical capabilities. But then, depending on who you’re talking to, there’s also maybe an interest in the augmentation of spiritual capabilities.
Posthumanism is, in a very simple way, a de-centering of humans or “the human.” Historically, humanism was talking about an Enlightenment project human, which is white male — so de-centering white males. And then looking at nature as this space where everyone is a part of this larger ecosystem. This is where you start having animal rights and emphasis on all things ecological — not only rights, but also sustainability movements and things like that — within this posthuman space, in addition to this pushback against speciesism. So they’re suggesting that every lifeform, all organic life on Earth, is a form of animal and as such ought to recognize its place within the larger system of nature.
Metahumanism is…you can think of it like this if you’ve seen Doctor Strange in the Multiverse of Madness. Each time he and this young lady, America, travel through multiple dimensions at one time, the dimensions are like bubbles. So, think about the bubbles. If you translated the bubbles as concepts, then the idea of a metahuman is more of a conceptual understanding of what one is. And so, a body is a site for meaning; the body itself is not necessarily a thing to be understood in a concrete or definite sense. It is something that carries meaning. And as such it can also carry multiple meanings simultaneously, which also suggests it could be a host for multiple sites of perspective that are housed both internally and externally at the same time.
My foray into this was this initial use of technology to materialize liberating realities for Black people, recognizing that posthumanism becomes the undergirding tenet. However, for my own personal sense, Black posthumanism was the underlying tenet. So, I initially set out the nine tenets of Black posthumanism, and that’s in the intro to my first text. That introduction outlines the basic framework that allows Black transhumanism to take place — because transhumanism is an immediate extension of the Enlightenment project, whereby progress [comes] over everything and it doesn’t matter who’s left in the wake of this progress. And so, it allows the racial stratification and the colonial harm that has taken place historically to be not only justified but perpetuated in its ongoing fashion. Again, it’s this mindset in which technological innovation is the most important thing.
Within this Black posthuman space — being in conversation with Blackness and Black people and really attempting to wrestle with this concept of what technology is — starts with the idea that our bodies are technology. All of the ten biological systems become their own kind of subsystems that allow for other features to emerge, whether it be personality, identity, sexuality, so on and so forth. And so, this Black posthuman space becomes the foundation, for it allows things to be nonlinear, things to be paradoxical and intentionally so, not in a random sense, but in a dynamic and a complex sense, which the binaristic chokehold of humanism and transhumanism attempts to suppress in order to keep order. When you look at experiments, they normally have two variables, not many more than that, because you can control smaller numbers of variables. The idea is to add a multitude of variables to increase the level of disorder and complexity — to, again, shake things up and produce what becomes this infinite or trans-infinite amount of potentiality in terms of what embodiment in action and realities can produce.
Haley: So, with your focus on Black liberation and AI, how do religion and spirituality intersect with technology?
Dr. Butler: You’ll see this in an upcoming edition of The Black Scholar. I initially defined spirituality as one’s connection with God, or that which is outside of self. Upon further reflection, for me, spirituality is anything that you do on a regular basis. People can say they believe anything, but I think looking at people’s actions reveals something. What we do on a regular basis suggests that we believe something other than what we profess to believe. And so, one’s actions and embodiment give way to one’s spiritual disposition.
This allows for a greater sense of internal reflection and criticality that [in turn] allows people to disengage from whatever can be determined as a proxy. There’s this idea that you only do what you believe is going to produce results for you. When you look at people’s behavior, there is a level of faith or belief in the behavior that they’re engaging in. And so it doesn’t necessarily have to engage with a God-like figure. It doesn’t have to engage with the universe. It’s about one’s relation with oneself and the external environment, at the very least. . . .
Looking at Black transhumanism specifically, technology is something that could be not only liberative to a people moving towards a liberative space, but also something spiritual in itself. It allows for critical and empirical work to be done to determine what might be the best or most effective – physically effective – spirituality for one individual.
Given the way that technology is moving, toward higher levels of customization, I think . . . the shift towards precision within technological innovation allows for an individual body to evolve between modes of embodiment and community, to be able to determine what is the best spiritual practice for that body. And because of that, we can then augment our spirituality in that way in relationship with technology, which would then aid us in this work for liberation.
So, I think it’s twofold, right? You can be a progenitor of technology — that is part of what my first text is about in trying to encourage Black participation in the creation of new technologies — in the formation of innovation.
[And second], recognizing the merger of technology and spirituality to know what kind of technology is out there, might give us information about our spiritual practices but also can help us become more spiritually attuned. So that on the road to this new space, technology can augment spirituality and subsequently spirituality can help buttress people from the sheer menace of the outside world.
This is especially true when we’re constantly bombarded with the ways in which racial violence is enacted on a regular basis. We just saw a gentleman get shot 60 times — these families are constantly losing people. So, what do we do? Do we have to pause every time? Yes, in a sense, to mourn. But how do we then maintain an internal disposition that continues to march towards freedom, towards liberation? I think this is where the merger [of technology and spirituality] becomes integral, because it allows a particular continuity on the inside. Then, that allows people to continue to engage in work and innovations as opposed to having to break down in ways that may pause the work that needs to be done.
Haley: That’s a good segue into your work on The Seekr Project. Could you give us a brief overview of what that project looks like and maybe how you see that rolling out in the future?
Dr. Butler: That’s a great question. The Seekr Project is a distinctly Black conversational AI with mental health capacities. It is not in the phase where it is NLP driven at the moment. It is a complex dialog tree, meaning that every conversation doesn’t have to be the same, but you will notice some similarities in the conversations that you do have with Seekr now. One of the things we have Seekr tell people in their first conversation is that this is meant to serve as a ritual. So, there is going to be some rapport, there’s going to be some repetition in the rhythm of dialog.
I built this as part of my dissertation. It was initially a geolocation device that was meant to help people stay connected to their spiritual center wherever they went. But we saw a few steps ahead how easy it would be to make a chatbot. And so, we made the bot.
As I was in conversation with the bot, I realized that it sounded like me. I was talking to a buddy of mine, and he was saying, “You know what? Maybe it’s a Black AI.” At that moment, I said, “You know what? It is.” . . . [H]igh levels of bias are already built into AI, in the ways in which AIs are already raced and gendered. Siri and Alexa were initially white women; now they have been revised so they can take on other personas. Fundamentally there is an underlying notion of the standard of whiteness, meaning that there is a way that they talk. There’s a cadence to their speech. There’s an attempt to move in a kind of standardized way of doing things.
What we’re trying to do in Seekr is to not only break that, but move into actions that are distinctly outside of that space, that are connected to Black culture, Black tempos, and even to Black modes of embodiment. We hope that when people engage with Seekr, especially Black people, they’re able to see themselves in a way that they can’t see themselves in other modes of technology, let alone be able to have someone on the other end that has a sense of what it is to live a life in a Black body.
In terms of rolling it out, for the last few years we’ve had it on a web application, where you go straight to the website or the landing page on our site. Now, we’re wrapping up a long review process with Apple — Apple is particularly strenuous. I appreciate it because it at least looks like they want high quality stuff. At the same time you keep getting stuff sent back, which is great because it helps the quality of the product that people are going to get. We’re hoping, but not sure, that this is going to be rolled out within the next few weeks in both the Apple and the Android stores. People have been beta testing it for a couple of months now. And so we’ve had some positive stuff and some things we’ve just had to continue to work on. It’s a growing process.
Haley: Are there other people or projects that take similar culturally-specific approaches that you learn from? Or do you feel like you’re really writing the roadmap as you’re building this project?
Dr. Butler: It depends. If we’re talking about conversation, I think in some ways I’m leading. I don’t want to speak in a way that makes my britches a little bit larger than they need to be, but at the same time, we’re talking about vision. You have Timnit Gebru and Safiya Noble, so there are names in this space.
Everybody is in some ways working in similar lanes — Gebru, for instance, teamed up on the Gender Shades paper. In some ways, it’s just a matter of which lane we’re talking about. In terms of providing mental health resources that are time tested, Seekr is trained on Internal Family Systems Therapy. In some cases [Internal Family Systems Therapy] still seems novel. But on the other hand, when people are attempting to get trained in this modality, the waiting list is like a year or two in advance. This is a highly sought after, highly effective modality that is non-pathologizing – it allows people to recognize all the things that are going on inside of them without suggesting that they have to stick to a rigid version of themselves. [Seekr] is getting the stuff that people are seeking and providing it to Black people in an incredibly low-cost manner.
I think in this regard, we’re leading this space. Even though the Shine app is doing this, and they have a bot within the Shine app, it’s more about “here’s what’s going on with you and here’s a recommendation.” Whereas Seekr is just asking you questions and allowing you to produce the answers, because part of our assumption is that you know everything about yourself anyway, it just takes some time to listen, spend some time with yourself, and trust what comes to the surface.
Haley: You touched on aspects of this question already, but how do you think tech can contribute to better mental health care outcomes?
Dr. Butler: I think there are a ton of ways. Not to give away some of our plans, but I do very much think so.
There are some companies who are already on it. There’s a company that a colleague of mine uses that tracks biorhythms and so forth, then gives you a map of your day and tells you, for example, last night you didn’t get enough sleep, or here’s your VO2 now, stuff like that. It may help with people’s cognition. For instance, if this is low, here are some recommendations. A couple years ago in China an AI had the ability to listen to the background noise. If it was loud, it would say, “oh, hey, are you in a loud space? Is everything okay?” That kind of thing. I think there are ways to embed metrics within this space, and to look at better contextual conversational elements using an NLP approach.
Even more, it’s a matter of giving people a better space to externalize what is happening. What does it mean for you to take the stuff that’s going on inside your head, inside your body, and project it into this app in a way that allows you to see that it is not as loud or convoluted or as jumbled as it may seem when you’re trying to do it on your own.
So I think technology has opened a lot of possibilities here. I remember listening to an NPR or Wall Street Journal podcast, and they were saying that the VA was running trials where there was a virtual person on the other end, and the vets felt rather good about that. There really aren’t too many limits, besides the imaginations of the people who are engaged in innovation.
Haley: One final question: What technologies, developments, and/or cultural shifts on the horizon do you think are most significant?
Dr. Butler: So this is going to be my shameless plug for the 80/20 project that we’re doing at the Iliff AI Institute [https://ai.iliff.edu]. One of the things that we’re working towards building is data ownership. The concept is that data is labor and there are companies out there that are paying people for their data. I think it’s a cool concept, but the step further is when we work to really decentralize data ownership so that you as the user own the very things that you do on the internet. These digital afterlives or digital footprints that we have, anything we tweet or anything we click on — they’re already being sold anyway. But if you’re not the owner of them, then when they turn around to flip it and sell it to somebody else, you don’t see any of [the profit]. But if you owned it, you would.
So what would it mean for people to own their data and be able to turn it into something meaningful for their lives, whether it be rent or a utility bill or something? And what would it mean for communities to be able to pool their data points together and then maybe have a new park or rec center or redo their whole block or make sure that everybody’s taken care of? It’s a matter of what kind of individual or collective benefits might come from the ownership of data. I don’t remember the number now, but the amount of data people generate per second is ridiculously large. I think the last time they really took the measure was in 2015, and we’re in 2022, so it’s only gotten larger. Again, I think the data piece is important.
[But] precision over prediction becomes the most important thing. I was working on a prediction model, where we take these data points and their history and put them together. We run the algorithm, and what we push out is this idea of “what might happen.”
Maybe this is just the academic in me, but one of the things that I think is a great marker of intelligence is a good question. If you take the data points and you turn them into a question, not so much a prediction, then maybe that is a better way for machines and people to be in relationship. For something to be precise as opposed to predictive means that if this dataset is specific to this neighborhood, to this town, to this group of people, then what might we provide them? As opposed to suggesting that we have this one-size-fits-all dataset and subsequent algorithm, on which everybody’s life is predicated or to which it is subject.
So the two things I say are 1) data ownership and 2) precision over prediction. And the last thing is that [emergent technologies] should be able to produce a really good question.
Haley Griese: Dr. Butler, thank you so much for your time and for sharing crucial work that you are developing and enacting in the world today.
Dr. Philip Butler: Thanks for asking these questions. Very thoughtful.