As AI Demand Surges, Energy-Hungry Data Centers Boom

As virtual meetings and AI interactions become commonplace, the infrastructure supporting these technologies remains largely unseen. Data centers—vast facilities housing the servers that power AI systems—are essential for processing information at lightning speeds.
Vijay Gadepally, a senior scientist at MIT’s Lincoln Laboratory, notes the immense scale of these operations. Each AI model needs substantial computing power, which means data centers must continuously expand to meet growing user demands. “If your millions or billions of users are talking to the system simultaneously, the computing systems have to really grow,” he explains.
The United States is positioning itself as a leader in the global AI landscape, home to thousands of data centers. While AI tools may appear to exist solely online, the facilities that enable their functionality consume significant resources, including energy and water.
Jennifer Brandon, a science and sustainability consultant, emphasizes the environmental impact of these centers. As AI usage rises, so does the strain on local energy grids. “We definitely try to think about the climate side of it with a critical eye,” she remarks.
Historically, the U.S. shifted from bulky desktops to sleek laptops, a transition that created demand for more robust data infrastructure. David Acosta, cofounder of ARBOai, explains that while large language models (LLMs) and machine learning (ML) technologies have been researched for decades, their recent surge in popularity among everyday users has accelerated the need for data centers.
To train and process AI models, substantial hardware is required, housed in data centers that have multiplied since the early 2000s. The shift to cloud storage further escalated demands for expansive facilities. Current estimates suggest the U.S. hosts more than 3,600 data centers, with approximately 80% located in just 15 states. Virginia leads with around 600 facilities, followed by Texas and California.
Investment in data center infrastructure is surging. Major firms, including BlackRock and Microsoft, have committed $30 billion to develop new facilities, reflecting the market’s significant growth. “If you own the data, you have the power,” Acosta states, calling for ethical oversight of the sector.
Data centers currently account for about 2% of U.S. energy demand, but projections indicate this could rise to 10% by 2027. The development of data centers offers financial benefits to communities but also poses environmental challenges, and local governments often face pressure to balance economic growth with the needs of residents.
As the demand for electricity grows, concerns about rising utility costs for local communities have emerged. In Georgia, legislators are pushing to protect residents from increases stemming from the energy needs of new data centers. Some areas are exploring sustainable energy options, such as Pennsylvania’s initiative to use a nuclear power plant to supply clean energy to nearby facilities.
Cooling systems in data centers represent another significant resource demand. Approximately 40% of a data center’s energy consumption goes toward cooling equipment, a process that often relies on large amounts of water. Brandon highlights the staggering scale of this consumption, comparing it to the resources of entire countries.
Looking ahead, energy efficiency is becoming indispensable to AI development. Innovations such as China’s DeepSeek, which aims to optimize energy use in its models, suggest a shift in focus toward sustainable practices. Gadepally indicates that companies must weigh whether the benefits justify the environmental costs of excessive power use.
Initiatives for localized AI tools could reduce the reliance on extensive data centers. By creating specialized models for particular sectors, organizations can minimize their environmental footprint and costs. The ongoing demand for AI capabilities shows no signs of abating, yet the imperative for energy efficiency may eventually reshape industry practices.
Tech leaders are beginning to realize the importance of sustainable practices as they invest in AI technology. As Gadepally asserts, optimizing resources could lead to significant savings without compromising performance. The future of data centers hinges on balancing innovation with responsibility.