While much of the attention on the AI phenomenon has rightly focused on its breathtaking power to disrupt and transform everything it touches, there is great concern – even alarm – about the massive amounts of electrical energy required to power the physical hardware behind what we now call “data factories”. This paper briefly discusses the background, current challenges, and economic and technological dimensions of data factories, then offers an actionable energy strategy for system architects to achieve more economically viable and morally tenable systems to support AI.
Background
Power grids in developed countries are being mightily strained. Municipalities everywhere are now pushing back on approving new projects because these giant data centers consume so much energy that communities fear brownouts and grid instability. Once built, these “hyperscale” data centers contribute little to the local economy, partly because of tax incentives, but mostly because so few workers are required to manage these giant sites. The money that moves these projects, in any case, is not aimed at developing local economies.
One excellent econometric report on the AI enterprise as a whole describes not only its scale but also the fragility of its ecosystem, from chip manufacturing to the monetization of tokens. A disruption in any one sector could stall the entire AI industry. However powerful and capable AI platforms become, they still rely on tokens riding on silicon chips housed in physical facilities, constrained by the laws of physics.
For the past 50 years, data center developers and operators designed, built, and operated data centers to the requirements set forth by server manufacturers, numerous standards bodies, and IT organizations. What went on inside those boxes was of little concern as long as the power density and reliability requirements were met. Power densities rose slowly as Moore’s Law of computational performance continued to hold true.
Today, the AI workload in a single cabinet the size of a refrigerator can draw more power than 500 homes. The largest data centers now consume more energy than the cities or metropolitan regions where they are built. Data centers in the United States grew from 1.9% of total electricity use in 2018 to over 6% today, and the DOE estimates this will reach 12% by 2028. Large developers must increasingly budget all of the costs and permits to construct their own on-site power plants in order to build a data center at all; this may mean securing a natural gas pipeline and supply position, if they can get one. Many of the wind farms we see around the world are funded by large data center companies to offset their greenhouse gas emissions, and Amazon and Microsoft are now contracting for Small Modular Reactors to produce carbon-free nuclear energy. While these efforts represent attempts to mitigate the problem of AI power demand, they are far more expensive to build and operate than traditional utility generation, and those costs must be passed on to end users at some point.
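A quick back-of-envelope check makes the cabinet claim concrete. The figures below are illustrative assumptions, not drawn from the article's sources: a next-generation high-density AI rack on the order of 600 kW, and an average continuous US household draw of roughly 1.2 kW (about 10,500 kWh per year).

```python
# Back-of-envelope sketch; both figures are assumed, not sourced.
rack_power_kw = 600.0   # assumed high-density AI cabinet (~600 kW)
avg_home_kw = 1.2       # assumed average continuous US household draw

homes_equivalent = rack_power_kw / avg_home_kw
print(f"One rack draws as much as about {homes_equivalent:.0f} average homes")
```

Under these assumptions, a single cabinet matches roughly 500 homes; a more typical 2024-era AI rack of around 130 kW would still match over 100 homes.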
While companies work to minimize the environmental impact of data center operations, the asymmetric growth means that even if energy from fossil fuels could be capped at, say, 10% of data center energy use, we would still see exponential growth in total GHG emissions because of AI. To further complicate the problem, many businesses aim to “air-gap” their data to protect it from potential leakage in shared cloud environments, so there is a trend toward more private AI data centers, sacrificing whatever economies of scale might be found in public clouds. This is especially important to research and healthcare institutions.
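The arithmetic behind the capped-share point is worth spelling out: a fixed fossil-fuel *share* does not cap fossil *emissions* when the total grows exponentially. The growth rate and starting total below are hypothetical figures chosen only for illustration.

```python
# Illustration with hypothetical figures: a capped fossil share still
# yields exponentially growing fossil energy use when total demand
# grows exponentially.
fossil_share = 0.10   # assumed cap: 10% of data center energy from fossil fuels
growth_rate = 0.25    # assumed 25% annual growth in total data center energy
total_twh = 300.0     # assumed starting total (TWh/year)

for year in (0, 4, 8):
    total = total_twh * (1 + growth_rate) ** year
    fossil = fossil_share * total
    print(f"year {year}: total {total:7.0f} TWh, fossil {fossil:5.0f} TWh")
```

Because the fossil term is just a constant multiple of the total, it inherits the same 25% annual growth: under these assumptions it would roughly double every three years despite the cap.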
Sustainable, Ethical and Affordable Scalability
Notwithstanding the promise of quantum computing or the potential of space-based data centers, I believe the appropriate answer to the question of resource and cost optimization lies in computational efficiency combined with the correct power systems architecture and intelligent power networks. My research team at T-Mobile and teams at other firms are working on portions of this problem, but these efforts are made in isolation. Like any complex system, the current fragmented structures, from the utilities down to how the chips function, require a comprehensive, dynamically managed ontology. Hundreds of billions of dollars are being invested in infrastructure to support the AI phenomenon, and the same tools that create the problem are ideally suited to solving it. What is needed is alignment across business silos.
Aside from economics and affordability, there is a moral imperative to solve these problems in ways that steward our planet’s limited resources equitably for all of humanity. The costs of neglecting this work are too great.
But who will take the lead?
Views and opinions expressed by authors and editors are their own and do not necessarily reflect the view of AI and Faith or any of its leadership.


