Missional AI (MAI) is an annual conference held at Wycliffe World Headquarters in Orlando, Florida. The conference is a space for Christian scholars, pastors, entrepreneurs, and developers to consider how to apply artificial intelligence and machine learning to further the kingdom of God. AI&Faith had a strong presence at the MAI conference, with advisor Adam Graber, executive leadership team member Thomas Osborn, senior editor Marcus Schwarting, and advisor Joanna Ng speaking. On the second day of the conference, Adam Graber and Marcus Schwarting presented on the importance of shared ethical standards surrounding large language models (LLMs) such as ChatGPT, Gemini, and Alpaca/LoRA. The presentation was followed by a breakout discussion, led by Thomas Osborn, on a potential standard benchmark for AI systems leveraged by Christian organizations. For more information on the MAI conferences and links to all the talks, see here.
During the conference, several key themes emerged.
The first theme of MAI was the continued efforts towards AI-aided Bible translation tasks. Several new tools were introduced at the conference, including “Greek Room” by Joel Mathew and Ulf Hermjakob, “Lynx” by Damien Daspit, and tools for Bible translation steering by Ryder Wishart. Aside from text-only translation methods, there has been a renewed focus on applying AI within other linguistic modalities including oral Bible translation and sign language Bible translation. Aiding researchers and linguists in translating the Bible has been a core component of MAI since its inception, and that continued this year.
The second theme of MAI was the use of large language models (LLMs). With a few notable exceptions, nearly every talk not focused on Bible translation incorporated LLMs in some fashion. This is a marked departure from MAI in 2023, when a wider variety of model architectures and techniques were employed across a range of datasets. Many of the talks in 2024 introduced chatbots designed as retrieval augmented generation (RAG) systems. A RAG system generates responses in two steps: it first retrieves relevant verses and passages from a trusted corpus such as the Bible, then supplies that retrieved text to an LLM as the basis around which the model can (ideally) formulate a reliable, well-grounded response to a user's query.
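To make the two-step pattern concrete, here is a minimal sketch of a RAG pipeline. The toy corpus, the naive word-overlap retriever, and all function names are illustrative assumptions for this example only; real systems described at the conference would use embedding-based retrieval and an actual LLM for the generation step.

```python
import re

def _tokens(text: str) -> set[str]:
    """Lowercased word tokens, punctuation stripped (toy tokenizer)."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Step 1: rank corpus passages by naive word overlap with the query.
    (Real systems typically rank by embedding similarity instead.)"""
    q = _tokens(query)
    ranked = sorted(corpus, key=lambda p: len(q & _tokens(p)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Step 2: ground the LLM by placing retrieved passages in its prompt.
    The prompt would then be sent to an LLM to generate the final answer."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer the question using only the passages below.\n"
        f"Passages:\n{context}\n"
        f"Question: {query}\n"
    )

# Illustrative corpus (a real deployment would index whole translations).
corpus = [
    "In the beginning God created the heavens and the earth.",
    "The Lord is my shepherd; I shall not want.",
    "Love your neighbor as yourself.",
]

query = "Who created the heavens?"
prompt = build_prompt(query, retrieve(query, corpus, k=1))
```

The appeal of this design for the organizations presenting at MAI is that the model's answer is anchored to retrieved scripture rather than to whatever the LLM happens to have memorized, which makes its output easier to audit.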
The final theme of MAI was the underlying concern around ethical standards and common best practices. Within Bible translation, there is an ever-present (and well-justified) concern that AI will compromise translation quality, especially for low-resource languages. For chatbots built to respond to user queries about Christian faith and doctrine, there was clear uneasiness about the unreliability inherent to any transformer-based architecture. Whether and how these reliability concerns can be adequately addressed remains an important objective for AI&Faith going forward, and will be an important factor for Christian organizations that wish to utilize LLMs while ensuring the models represent them responsibly.
The 2024 MAI conference was a time to connect with like-minded Christian innovators and developers to see where and how AI should be used to serve God. For all the challenges that come with these tools, I believe that we have been given them to do good. It was a great encouragement to see others leveraging their talents to make that belief a reality.