As the U.S. races to maintain its competitive edge in the global artificial intelligence sector, data centers are multiplying at an unprecedented rate across the country. However, a new analysis from the Pew Research Center reveals that this technological boom comes with significant costs—primarily a mounting strain on America’s energy and water infrastructure that could translate into higher utility bills and environmental challenges for the average citizen.
According to federal and international data reviewed by Pew, U.S. data centers consumed 183 terawatt-hours of electricity in 2024, representing approximately 4% of the nation’s total electricity usage. The International Energy Agency (IEA) notes this consumption roughly equals Pakistan’s entire annual electricity demand.
The primary culprits behind this surge are hyperscale facilities specifically designed to power AI models. Pew and IEA research indicates a typical AI-optimized hyperscale center consumes electricity equivalent to 100,000 homes annually. More concerning are the next-generation mega-facilities currently under development, which could use up to 20 times more power once operational.
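As a rough sanity check on the "100,000 homes" comparison, the figures can be converted into an annual energy total and a continuous power draw. The household consumption figure below (about 10,500 kWh per year, an EIA-style ballpark) is an assumption for illustration, not a number from the article:

```python
# Assumption (not from the article): an average U.S. household uses
# roughly 10,500 kWh of electricity per year.
HOUSEHOLD_KWH_PER_YEAR = 10_500
HOURS_PER_YEAR = 8_760

homes_equivalent = 100_000

# Total annual energy of a hyperscale facility matching 100,000 homes.
annual_twh = homes_equivalent * HOUSEHOLD_KWH_PER_YEAR / 1e9  # kWh -> TWh

# The same energy expressed as an average continuous draw in megawatts.
avg_power_mw = annual_twh * 1e12 / (HOURS_PER_YEAR * 1e6)  # Wh -> MW

print(f"Annual use: {annual_twh:.2f} TWh")          # about 1.05 TWh
print(f"Continuous draw: {avg_power_mw:.0f} MW")    # about 120 MW
```

Under these assumptions, a single AI-optimized hyperscale center runs at roughly 120 MW around the clock, and a next-generation facility at 20 times that scale would approach 2,400 MW, comparable to the output of two large power plants.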
These centers are becoming particularly concentrated in certain regions. In Northern Virginia, a major data center hub, these facilities now account for more than 25% of the state’s total electricity consumption, according to the Electric Power Research Institute.
The economic implications for consumers are becoming increasingly apparent. As utilities scramble to upgrade grid infrastructure to meet growing demand, those costs are inevitably passed on to ratepayers. In the PJM electricity market, which serves states from Illinois to North Carolina, Pew’s analysis of market data shows data center expansion has added $9.3 billion in capacity costs for 2025-26 alone.
This translates to tangible increases in monthly bills: roughly $18 more per month for households in western Maryland and about $16 more for those in Ohio. Looking ahead, researchers at Carnegie Mellon University project that U.S. electricity bills could rise by 8% by 2030 solely due to data centers and cryptocurrency mining operations, with even steeper increases expected in regions with high concentrations of these facilities.
While electricity consumption dominates headlines, water usage represents what experts call “the quiet problem” in the AI infrastructure boom. Federal figures cited by Pew show that U.S. data centers consumed 17 billion gallons of water in 2023, primarily for cooling energy-intensive AI processing chips at hyperscale locations.
Projections suggest this water usage could nearly double by 2028, with hyperscale centers alone potentially consuming between 16 and 33 billion gallons annually, comparable to the yearly water consumption of a medium-sized American city. Though cooling requirements fluctuate seasonally and vary by facility design, researchers emphasize that the overall trajectory points to rapidly increasing water demand driven by AI growth.
Despite these significant resource implications, public opinion remains divided on AI’s environmental impact. A Pew survey conducted in August 2024 found no clear consensus: 25% of Americans believe AI will harm the environment, while 20% think it will ultimately prove beneficial. Another 25% expect a mixed outcome, and a substantial 30% remain uncertain.
This public uncertainty mirrors the broader reality facing policymakers and industry leaders: the United States is experiencing a resource-intensive technological transformation without a comprehensive long-term strategy to address its infrastructure demands and environmental consequences.
As AI continues its expansion into virtually every sector of the economy, the tension between technological advancement and resource sustainability presents a growing challenge for communities across the country, particularly those hosting these increasingly power-hungry data centers.