
WASHINGTON (TND) — Data center power demand is projected to grow by 160% by 2030, primarily driven by advancements in artificial intelligence, according to a new study from Goldman Sachs Research.

Understanding the Increased Demand

Chrysta Castaneda, an energy industry lawyer, emphasizes that the computing capacity and electricity required for AI operations are far greater than most people realize. “So much more computers are needed and so much more electricity is needed to run the computers,” she stated.

Comparative Energy Usage

The International Energy Agency reports that a single ChatGPT query consumes about 2.9 watt-hours of electricity, nearly ten times the roughly 0.3 watt-hours of a typical Google search. That gap illustrates how quickly AI workloads can compound into large energy demands.
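For readers who want to see how those per-query figures scale, here is a minimal back-of-the-envelope sketch in Python. The per-query numbers are the IEA estimates cited above; the daily query volume is a hypothetical placeholder chosen purely for illustration, not a figure from the article or the IEA.

    # Per-query energy figures cited in the article (IEA estimates)
    CHATGPT_WH_PER_QUERY = 2.9   # watt-hours per ChatGPT query
    GOOGLE_WH_PER_QUERY = 0.3    # watt-hours per Google search

    # How much more energy one AI query uses than one conventional search
    ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
    print(f"A ChatGPT query uses about {ratio:.1f}x the energy of a Google search")

    # Hypothetical daily volume, used only to show how per-query costs add up
    queries_per_day = 10_000_000
    extra_wh = (CHATGPT_WH_PER_QUERY - GOOGLE_WH_PER_QUERY) * queries_per_day
    print(f"At {queries_per_day:,} queries/day, the added demand is "
          f"{extra_wh / 1_000_000:.1f} MWh per day")

Under that assumed volume, the sketch prints a ratio of roughly 9.7x and an added load of about 26 MWh per day, showing how small per-query differences become grid-scale demand.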

Challenges Ahead

Castaneda warns, “There are going to be a lot of challenges of power management over the next decade or so while we get adjusted to the new reality.” Michael Nowatkowski, a professor of cyber sciences at Augusta University, adds that the surge in power usage for AI could strain the electric grid.

“That could cause more severe brownouts or even loss of power depending on the demand placed on the power grid,” said Nowatkowski.

Impact on Electricity Prices

Fariba Mamaghani from Tulane University’s Business School notes that electricity prices will likely rise in response to increased demand. “Electricity will go up and they will experience the higher electricity bills,” she explained.

Regional Impacts

The effects of this increased demand will be particularly felt by residents near data centers in regions such as Northern Virginia, Dallas, and Silicon Valley. “It’s concerning because it will impact end users in these scenarios,” Mamaghani said.

According to Goldman Sachs Research, AI is expected to account for approximately 19% of total data center power demand by 2028.