The AI boom is coming with a staggering price tag—$200 billion and enough electricity to power a major city.
A new study by researchers at Georgetown University, Epoch AI, and Rand Corporation reveals that building a leading-edge AI data center could soon cost more than most tech unicorns combined. By 2030, the world’s top AI facilities may require 2 million AI chips and draw 9 gigawatts (GW) of power — nearly the output of nine nuclear reactors.
The study examined over 500 AI data center projects from 2019 to early 2025, analyzing trends in capital costs, power requirements, and compute performance. The findings? We’re heading into uncharted territory.
How Much Does an AI Data Center Cost? Try $200 Billion
The report’s big takeaway: AI data center costs are doubling annually — and fast becoming one of the most capital-intensive sectors in tech.
Consider xAI’s Colossus project, reportedly valued at $7 billion and drawing 300 megawatts of power (enough to supply roughly 250,000 homes). Double that annually for five more years, and you arrive at a potential $200B mega-center by 2030.
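The back-of-the-envelope math behind that projection is simple compounding. A minimal sketch, assuming the study's reported annual doubling holds and taking Colossus's reported ~$7B and ~300 MW as the 2025 starting point (both figures from this article, not the study itself):

```python
# Back-of-the-envelope projection: costs and power draw doubling annually,
# starting from xAI Colossus's reported figures.
cost_billion = 7.0   # reported Colossus cost, in billions of dollars
power_mw = 300.0     # reported Colossus power draw, in megawatts

for year in range(2025, 2031):
    print(f"{year}: ~${cost_billion:.0f}B, ~{power_mw / 1000:.1f} GW")
    cost_billion *= 2  # doubling each year, per the study's cost trend
    power_mw *= 2      # power draw also roughly doubles annually
```

Five doublings take $7B to $7B × 2⁵ = $224B, in the neighborhood of the study's $200 billion figure, and 300 MW to 9.6 GW, matching the ~9 GW projection.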
Major players like OpenAI, Microsoft, Google, and AWS are already racing to expand their compute capacity. OpenAI, for instance, is partnering with SoftBank to raise up to $500 billion to build a sprawling network of next-gen AI data centers across the U.S. and potentially beyond.
Powering the AI Age: A Grid Under Pressure
While AI data center hardware has become 1.34x more energy efficient each year (in compute delivered per watt), total power draw is ballooning even faster — doubling annually since 2019, according to the study.
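Those two trends compound rather than cancel out: if power doubles while each watt delivers 1.34x more compute, total compute grows by their product every year. A quick sketch of that reasoning, using the study's two growth rates:

```python
# Total compute growth = power growth x efficiency gain, per year.
efficiency_gain = 1.34  # compute-per-watt improvement per year (study figure)
power_growth = 2.0      # total power draw doubling per year (study figure)

compute_growth = efficiency_gain * power_growth
print(f"~{compute_growth:.2f}x more compute per year")

# Compounded over the 2019-2025 window covered by the study (6 years):
print(f"~{compute_growth ** 6:.0f}x more total compute")
```

In other words, efficiency gains are real but are being swamped by sheer scale: roughly 2.68x more compute per year, or on the order of a few hundred times more over the six-year window.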
By 2030, the leading AI hub may need 9 GW of electricity. To put that in perspective:
- That’s the entire power consumption of New York City during peak summer load.
- Or roughly 5% of total U.S. data center electricity use, projected to grow by 20% by the end of the decade.
This escalating demand could outpace renewable energy supply, which relies on intermittent sources like wind and solar. Experts warn this may push utilities to reinvest in fossil fuels, just as the world is trying to cut back.
Environmental & Economic Fallout: Water, Land, and Taxes
AI data centers don’t just hog electricity. They also:
- Consume massive volumes of water for cooling (a serious concern in drought-prone regions)
- Occupy large swathes of land, often in suburban or rural areas
- Drain local tax revenues due to generous incentives
A Good Jobs First report found that at least 10 U.S. states each lose more than $100 million annually in tax revenue to data center deals, many of which include property, sales, and energy tax breaks to attract big tech players.
A Tipping Point or Temporary Pause?
Some analysts are now questioning whether this hyper-expansion is sustainable.
A recent Cowen & Co. investor note highlighted a “cooling” in data center spending in early 2025. Even giants like AWS and Microsoft are reportedly delaying some projects due to cost concerns and energy constraints.
Still, if demand for generative AI tools continues on its current trajectory, many experts believe we’ll hit the $200 billion mark within the next 6 years — setting up a fierce battle over resources, regulation, and environmental impact.