AI Data Centers: 3x Demand, Billion-Dollar Projects, New Energy Sources
Infrastructure is struggling to keep up, according to one expert source — an AI model
Welcome to the Cloud Database Report. I’m John Foley, a long-time tech journalist who has also worked in strategic comms at Oracle, IBM, and MongoDB. Now I’m an independent tech writer. Connect with me on LinkedIn.
The terms hot and cool can sometimes be used interchangeably. For example, a new Ferrari might be described as hot by some people and cool by others. Linguists call this construction, where seemingly opposite words share the same meaning, antonymous synonyms.
Data centers can be both hot and cool. Hot in the sense that hyperscalers are building data centers as fast as they can. And cool in the sense that they require lots of ventilation and liquid cooling to keep from overheating.
All of this is getting a lot of attention these days, as AI training and inference workloads drive a boom in data center construction around the world. Hyperscalers are on track to spend more than $300 billion on CapEx in 2025, much of it going to data centers and GPUs.
I’ve been reporting on data centers for many years. I took the photo above, of a Google data center along the Columbia River, on a trip to Oregon several years ago. On another trip to Oregon, I drove out to an Amazon data center that was under construction only to find that the project had been temporarily halted. You’ll find more details about that later in this article.
But first, let’s catch up on some of what’s been happening.
Meta is seeking $29 billion in funding for data centers, according to the Financial Times. Meanwhile, the New York Times suggests the world is being split into AI haves and have-nots, illustrating the divide with a global map of where AI data centers are located.
EV batteries are now being repurposed to power an AI data center, as you can see from the photo in this MIT Tech Review article. In Texas, meanwhile, data centers have begun running their own gas-fueled power plants on site.
Oracle plans to buy 400,000 NVIDIA GPUs as part of the $500 billion Stargate project, according to reports. NVIDIA says its Blackwell Tensor Core architecture delivers efficiency gains of up to 25x and 50x compared with its earlier-generation H100 Tensor Core baseline.
Energy per prompt
Joanna Stern, tech reporter for the Wall Street Journal, has written an interesting article on AI energy requirements, “How Much Energy Does Your AI Prompt Use? I Went to a Data Center to Find Out.”
Stern found it difficult to get a satisfactory answer to that question. Google, Microsoft, and Meta all declined to provide details. OpenAI’s Sam Altman came closest, responding that the average ChatGPT query uses about 0.34 watt-hours of energy, though without detailing how that figure varies. “Tech companies need to tell us more about the energy they’re using on our behalf,” she writes.
Agreed, but there’s also this data point: Stern talked to Hugging Face, which has run its own energy-consumption tests and found that the energy used to generate content ranges from 0.17 watt-hours on the low end to as much as 110 watt-hours.
The industry trend is toward more efficient AI models and processors, which presumably could lower the energy-per-query ratio (my assumption anyway), but there will be more queries, more workloads, and more data centers, so the overall energy requirements are increasing. “Demand for these GPU-filled buildings keeps multiplying,” Stern writes. Which is why we’re talking about data centers with their own gas-fueled power plants and nuclear reactors.
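To see why per-query efficiency gains can still lose out to query growth, here’s some back-of-the-envelope arithmetic using the per-prompt figures cited above. The query volume is a hypothetical assumption for illustration, not a number from the article:

```python
# Rough aggregate-energy arithmetic from the per-prompt figures above.
AVG_WH_PER_QUERY = 0.34      # Altman's figure for an average ChatGPT query
LOW_WH, HIGH_WH = 0.17, 110  # Hugging Face's measured range

# Hypothetical daily query volume (an assumption, not from the article):
QUERIES_PER_DAY = 1_000_000_000  # 1 billion

def daily_energy_mwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in megawatt-hours (1 MWh = 1,000,000 Wh)."""
    return wh_per_query * queries / 1_000_000

print(f"Average case: {daily_energy_mwh(AVG_WH_PER_QUERY, QUERIES_PER_DAY):,.0f} MWh/day")
print(f"Range: {daily_energy_mwh(LOW_WH, QUERIES_PER_DAY):,.0f}"
      f"-{daily_energy_mwh(HIGH_WH, QUERIES_PER_DAY):,.0f} MWh/day")
```

At a billion queries a day, even Altman’s modest 0.34 watt-hours works out to roughly 340 MWh daily, and the Hugging Face high end is several hundred times that. Cut the per-query figure in half and a doubling of query volume wipes out the gain.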
Demand and supply
Two key points here: Demand for AI data center capacity is growing fast. And energy supply is struggling to keep up.
On point #1, McKinsey predicts that global demand for data center capacity could nearly triple by 2030, “potentially resulting in a serious supply deficit.”
On point #2, I asked Perplexity, “Are energy power requirements able to keep up with demand for AI data centers?”
Following is Perplexity’s response, unedited and in its totality (within the quotation marks):
“The rapid growth of AI is driving a steep increase in power demand for data centers, but current infrastructure is struggling to keep up. In the U.S., data center power demand is projected to more than double by 2035, rising from about 35 gigawatts in 2024 to 78 gigawatts, with actual energy consumption nearly tripling in the same period 1. Globally, AI data centers may require up to 68 gigawatts by 2027, nearly matching California’s total 2022 power capacity 2.
However, there are significant challenges:
Grid Capacity and Power Shortages: By 2025, up to 40% of AI data centers may not have enough power to function fully due to grid capacity constraints 3. Some regions already face long wait times for grid connections, with waits of up to seven years in some cases 4.
Infrastructure and Supply Chain Issues: Building new power generation and transmission infrastructure is slow, and supply chain disruptions are compounding delays 54.
Rising Prices and Climate Concerns: The surge in demand is expected to increase electricity prices and make it harder to reduce carbon emissions 3.
Efforts to address these issues include adopting more efficient hardware, using renewable energy, and implementing energy storage solutions like thermal energy storage to help manage peak demand 6. Despite these innovations, the current pace of infrastructure development may not be sufficient to fully meet the rapidly growing power needs of AI data centers in the short to medium term 45.”
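For a sense of what those projections imply year over year, here’s a quick compound-growth calculation using the U.S. figures from the quoted answer (35 GW in 2024 to 78 GW by 2035). The figures are Perplexity’s, not independently verified:

```python
# Implied compound annual growth rate (CAGR) from the quoted projection:
# U.S. data center power demand rising from ~35 GW (2024) to ~78 GW (2035).

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate, as a fraction."""
    return (end / start) ** (1 / years) - 1

rate = cagr(35, 78, 2035 - 2024)
print(f"Implied annual growth: {rate:.1%}")  # roughly 7.6% per year
```

Roughly 7.6% compounded annually: modest-sounding on paper, but in an industry where grid interconnections can take seven years, a decade of sustained compounding is exactly the kind of demand curve that outruns infrastructure.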
Driving to the Port of Morrow in a borrowed pickup truck
It’s worth remembering that data center demand can ebb and flow. This past January, the release of a DeepSeek AI model triggered a 3% drop in the Nasdaq due in part to concerns that more efficient models would require less processing power.
I have some firsthand experience with data center construction — and lack thereof. Back in 2009, there were reports that Amazon was building a data center at a 60-acre site in Boardman, Oregon, along the Columbia River. I happened to be in Oregon at the time, so I drove out to see for myself in a borrowed pickup truck.
The road to the data center had a barrier across it, but having driven this far, I wasn’t about to turn back. So I drove around the barricades. On the other side of the fenced property, tumbleweeds rolled past half-constructed buildings.
As it turned out, the project had stalled, as I reported in InformationWeek. My article originally included a photo essay, but the link is now broken. Here’s what I wrote at the time:
For now, Amazon’s Oregon data center sits idle at the end of newly paved road that’s now blocked to traffic. Its neighbors at the Port of Morrow include RDO Calbee Foods’ potato-processing plant and Reklaim Technologies' tire recycling facility.
Amazon’s site, surrounded by an eight-foot fence topped with barbed wire, is located approximately three-quarters of a mile from the Columbia. The only activity during my visit was a backhoe digging a trench. The building is a partially completed shell and appeared to be empty, with no servers, cooling systems, or other equipment visible.
At the time, Amazon declined to comment, but local officials pointed to economic conditions as cause for the delay.
How long will the current data center boom continue? Some are already predicting a CapEx pullback as early as next year. I’ll believe it when I see it with my own eyes on the backroads of Oregon.