Researchers question AI data centers’ ‘eye-popping’ energy demands

Jonathan Koomey remembers the hype around electricity demand during the dot-com bubble. Now, he is raising the alarm over stark parallels he sees between that decades-old speculation and the projected energy demand of artificial intelligence data centers.
Regional grid operators in the U.S. are projecting major increases in the amount of power required in the coming years, with artificial intelligence as the primary driver. The need to build capacity on the power grid is already raising costs, as utilities scramble to build power plants and Big Tech invests in nuclear power. However, some experts are skeptical of the AI-related electricity growth projections, warning that they could lead to higher prices for consumers.
In an interview with Straight Arrow News, Koomey described how, in the late 1990s, many people believed that computers would use half of all the electricity produced in the U.S. within a decade or two.
“It turned out that across the board, these claims were vast exaggerations,” said Koomey, who has spent his career researching the energy and environmental effects of information technology, including more than two decades as a scientist at the Lawrence Berkeley National Laboratory.
Koomey is part of a growing number of researchers and consumer advocates who worry that the power consumption hype is playing out again with AI.
“Both the utilities and the tech companies have an incentive to embrace the rapid growth forecast for electricity use,” Koomey said.
How much is electricity demand expected to spike?
Across the country, the regional power grid operators that manage real-time supply and demand to keep power flowing are singing the same song: America needs more power.
America’s largest regional power grid expects a 42% increase in peak demand for electricity by 2040.

Beyond AI data centers, cloud computing, cryptocurrency mining, new manufacturing facilities and the electrification of heating and transportation all contribute to growing electricity demand. But artificial intelligence is the fastest-growing factor.
The nation's largest grid operator, PJM, expects peak electricity demand — the maximum amount of power needed at a single moment — to increase by 70,000 megawatts over the next 15 years. That's 42% more than the current peak on a grid that serves 65 million people from Washington, D.C., to Chicago.
In Texas, growth is expected to be even faster. Peak demand is projected to nearly double from 2024 levels by 2030, according to the state’s major grid operator, the Electric Reliability Council of Texas.
Nationwide, estimates vary. Grid Strategies, a consulting company in the power sector, projects that the U.S. will need 128 gigawatts of additional power — a nearly 16% increase — by 2029. Some estimates are even higher. Consulting firm ICF expects nationwide demand to be about 25% higher by 2030 and 78% higher by 2050, with 2023 serving as the baseline.
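For readers who want to gauge the scale of these projections, the quoted growth figures can be back-solved into rough implied baselines. This is illustrative arithmetic only, derived from the percentages and totals reported above; neither PJM nor Grid Strategies publishes these exact baseline numbers here.

```python
# Illustrative back-of-the-envelope check of the growth figures quoted above.
# These baselines are inferred, not official figures from PJM or Grid Strategies.

pjm_increase_mw = 70_000   # projected peak-demand growth over ~15 years
pjm_growth = 0.42          # stated as 42% above today's peak
pjm_baseline_mw = pjm_increase_mw / pjm_growth
print(f"Implied current PJM peak: ~{pjm_baseline_mw:,.0f} MW")

us_increase_gw = 128       # Grid Strategies' projected national increase by 2029
us_growth = 0.16           # stated as a nearly 16% increase
us_baseline_gw = us_increase_gw / us_growth
print(f"Implied national baseline: ~{us_baseline_gw:,.0f} GW")
```

The arithmetic implies a current PJM peak of roughly 167,000 MW and a national baseline of roughly 800 GW — consistent in rough scale with published U.S. grid figures, which is one simple way to sanity-check headline percentages like these.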
For most Americans, electricity rates are already rising. And as demand for power increases, consumers are likely to pay even more.
What are ‘phantom’ data centers?
Utility companies are also counting duplicate data centers in their growth projections.
“One data center may shop around a few different locations before it decides where it finally wants to be,” said Cathy Kunkel, a consultant at the Institute for Energy Economics and Financial Analysis.
When a tech company wants to build a data center, it files a request with a utility company to connect to the grid. The utility companies report how many requests they’ve received to regional grid operators like ERCOT and PJM, and the grid operators use that information to estimate how much electricity demand will grow.
However, utility companies do not typically communicate with each other to determine whether they are receiving duplicate requests from the same technology companies. This makes it difficult to gauge future electricity needs accurately. The Wall Street Journal recently reported that some utility companies’ projections for future power demand are several times higher than existing peak demand.
“Many data centers that are talked about and proposed and in some cases even announced will never get built,” said Sean O’Leary, a senior researcher at the Ohio River Valley Institute, in an interview with SAN.
Nevertheless, utility companies are racing to secure more power sources to meet growing demands.
Where are data centers getting the power?
Although Big Tech has announced investments in nuclear power, some of those deals are only intended to keep existing plants online. Other investments, such as Google’s bet on still-unproven nuclear fusion technology, are promising but remain speculative, long-term plays.
To meet immediate new demand, many utility companies are turning to gas-fired power plants. Entergy Louisiana recently received state approval to build three new gas power plants to serve a Meta data center currently under construction. The power plants will add enough electricity to the grid to power two new cities the size of New Orleans.
Nationwide, about 114,000 megawatts of new gas power plants are currently under construction or in pre-construction planning, according to reporting from Reuters.
When a utility company seeks regulators’ approval for a new power plant, the company also asks for a guaranteed profit margin on the new infrastructure investments, which typically comes from increased rates.
O’Leary told SAN the “utility is going to be able to recover the cost of that power plant plus a commercial rate of return, whether or not that plant is ultimately needed or not.”
One similarity Koomey sees between the dot-com boom and today’s electricity growth is that “there’s a whole ecosystem of people who are very willing to make simple extrapolations,” and it’s in the interest of both Big Tech and the utility companies to generate “eye-popping numbers.”
Are the AI companies profitable?
Critics of the AI growth projections point out that the technology has not yet proven it can be a source of profit.
“The forecasts that we’re seeing right now are basically what the tech industry wants to happen and what they’re selling to their investors,” Kunkel said. “But the reality is that their financials don’t match that picture that they’re painting.”
In 2024, OpenAI, the company behind ChatGPT, reported a $5 billion loss, according to CNBC. Anthropic, which runs the chatbot Claude, also did not turn a profit in 2024, and one recent analysis suggests the company is losing money on its paid subscribers. Nevertheless, the promise of artificial intelligence continues to attract new investors and valuations in the hundreds of billions of dollars for these AI-focused companies.
Meanwhile, Big Tech companies like Meta and Google are racing to recruit the best AI researchers, spending hundreds of millions on individual salaries.
So far, Koomey and other critics are skeptical that the public’s interest in AI will match the investments that Big Tech is making. Referring to recent viral incidents of Google’s AI tools offering flawed results, Koomey said, “These things still tell you to eat rocks and put glue in your pizza.”
“It would not be weird for an emerging technology to suddenly hit a ceiling in terms of popularity,” said Alex de Vries, a researcher at VU Amsterdam.
What are the physical constraints of meeting demand?
Even if consumer demand for AI matches Big Tech’s ambitions, the industry might first run into physical constraints.
Alex de Vries is the founder of the website Digiconomist, where he writes about the economics of digital trends, including artificial intelligence.
In an interview with SAN, de Vries said tech companies are already using all the advanced computer chips that manufacturers can produce. On one hand, this points to strong demand for AI systems; on the other, a recent analysis from London Economics International found that supply-chain constraints mean many proposed data centers will not be able to obtain the AI chips they need.
“You can’t really predict much further than like one to two years into the future,” de Vries said, because “the supply chain capacity is fully utilized.”
Will AI models get more efficient?
The power needs of artificial intelligence also vary depending on the use case. For example, creating an AI-generated video requires much more energy than interacting with a text-based AI chatbot. New “reasoning” models capable of advanced analysis will require more power than conventional large language models, de Vries said.
Energy efficiency also varies depending on the AI model. Earlier this year, the Chinese AI model DeepSeek demonstrated functionality similar to ChatGPT’s while reportedly using up to 90% less electricity.
“For a data center, the single largest variable cost is the electric bill,” said O’Leary, who expects Big Tech to look for ways their AI models can become more energy efficient.
However, Tyson Slocum, director of the energy program at the nonprofit Public Citizen, noted that consumers are already experiencing increased electric bills as the market adjusts to increased electricity demand.
In an interview with SAN, Slocum said utility companies and the tech industry “intentionally create a sense of hype and panic around the energy consumption of AI, because both had a shared financial interest in doing so.”