How AI Is Putting Data Center Infrastructure Under Pressure

AI (artificial intelligence) is moving fast, and the infrastructure behind it is feeling the strain. In a recent Forbes article, senior contributor and technology expert Jennifer Kite-Powell breaks down how the rapid growth of AI is putting serious pressure on data center infrastructure and the energy grid that supports it. According to a June 2025 Deloitte survey, AI data centers already consume about four gigawatts of power, enough to supply roughly three million U.S. homes. By 2035, that figure could climb to more than 120 gigawatts, enough for over half of all American households.

The impact shows up even in simple interactions. A single ChatGPT query can use nearly ten times the electricity of a typical Google search, according to the International Energy Agency. And by 2028, Goldman Sachs Research estimates, AI could account for nearly one-fifth of total data center power demand even as efficiency improves.

That demand isn't just big; it's unpredictable. Cirba Solutions CEO Klanecky explains that AI workloads create dense, always-on power usage that many local grids weren't built to handle. When several data centers scale up in the same region, they can strain substations, stress transmission lines, and even push energy costs higher for nearby residents.

Because building new power infrastructure takes time, operators are focusing on smarter upgrades inside existing facilities. Batteries, better cooling strategies, and drop-in efficiency technologies are gaining attention because they improve performance without downtime.

Kite-Powell's big takeaway is that AI is forcing data centers to rethink everything at once. Power, cooling, storage, and sustainability can no longer be solved in isolation, and the operators who adapt fastest will be the ones who keep up.
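The survey figures above imply a rough average power draw per home, which makes the 2035 projection easy to sanity-check. A minimal back-of-the-envelope sketch, assuming the per-home figure derived from the 2025 numbers and a ballpark count of about 132 million U.S. households (an assumption, not from the article):

```python
# Back-of-the-envelope check of the article's power figures.
# Source figures: 4 GW serves ~3 million homes today (Deloitte, June 2025);
# projection of 120+ GW by 2035.
# Assumption: ~132 million U.S. households (rough Census-scale estimate).

AI_POWER_2025_GW = 4          # AI data center consumption today
HOMES_2025 = 3_000_000        # homes that 4 GW could supply
AI_POWER_2035_GW = 120        # projected consumption in 2035
US_HOUSEHOLDS = 132_000_000   # assumed total U.S. households

# Implied average power per home: 4 GW / 3M homes (GW -> kW is x1e6)
kw_per_home = AI_POWER_2025_GW * 1e6 / HOMES_2025

# Homes the 2035 projection could supply at the same average draw
homes_2035 = AI_POWER_2035_GW * 1e6 / kw_per_home
share = homes_2035 / US_HOUSEHOLDS

print(f"{kw_per_home:.2f} kW per home")                       # ~1.33 kW
print(f"{homes_2035 / 1e6:.0f} million homes ({share:.0%})")  # ~90 million, ~68%
```

At the same per-home average, 120 GW works out to roughly 90 million homes, which is consistent with the article's "over half of all American households."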


For Full Article, Click Here