The rapid expansion of artificial intelligence (AI), particularly large language models (LLMs) and generative AI, has driven an unprecedented surge in energy demand, owing to the computational intensity of training and operating these systems. Eric Schmidt, former Google CEO, has identified electricity as the primary constraint on AI growth, estimating that the U.S. will need an additional 92 gigawatts (GW) of generating capacity (roughly the output of 92 large nuclear reactors) to sustain the AI revolution. This analysis examines the current energy consumption of major companies’ AI infrastructure, projects future energy needs through 2035, and considers how these demands will reshape the energy sector, drawing on available data from web sources and posts on X.
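As a rough sanity check on that figure, the sketch below (a minimal calculation, assuming a typical large reactor delivers about 1 GW of continuous output; that per-reactor figure is an illustrative assumption, not a number from Schmidt’s remarks) converts 92 GW into reactor equivalents and annual energy:

```python
# Rough sanity check of the 92 GW estimate. The ~1 GW-per-reactor figure
# is an illustrative assumption (typical large reactor output), not a
# number from Schmidt's remarks.

ADDITIONAL_DEMAND_GW = 92      # projected extra U.S. capacity for AI
REACTOR_OUTPUT_GW = 1.0        # assumed continuous output of one large reactor
HOURS_PER_YEAR = 24 * 365

reactor_equivalents = ADDITIONAL_DEMAND_GW / REACTOR_OUTPUT_GW
annual_energy_twh = ADDITIONAL_DEMAND_GW * HOURS_PER_YEAR / 1_000  # GWh -> TWh

print(f"Reactor equivalents: {reactor_equivalents:.0f}")               # 92
print(f"Energy if run continuously: {annual_energy_twh:,.0f} TWh/yr")  # ~806
```

By that arithmetic, 92 GW of always-on demand works out to roughly 800 TWh per year, on the order of a fifth of current annual U.S. electricity generation.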
Current Energy Consumption by Major Companies
Overview
Major tech companies, or “hyperscalers” (e.g., Microsoft, Google, Meta, Amazon, OpenAI), are the primary drivers of AI infrastructure energy demand, operating massive data centers for the training and inference of AI models. Training a single state-of-the-art model such as OpenAI’s GPT-4 is estimated to consume around 50 gigawatt-hours (GWh) of electricity, equivalent to the annual electricity use of roughly 4,800 U.S. households. Inference (running trained models to answer user queries) is also energy-intensive: a single ChatGPT query requires approximately 2.9 watt-hours, nearly ten times the 0.3 watt-hours of a conventional Google search. A quick back-of-envelope check of these equivalences appears in the sketch below.
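The snippet below reproduces the household and per-query comparisons; the average U.S. household figure of about 10,500 kWh per year is an assumed ballpark, not a number taken from the sources above:

```python
# Cross-check the training and per-query figures quoted above. The average
# U.S. household consumption (~10,500 kWh/year) is an assumed ballpark,
# not a number from the original sources.

GPT4_TRAINING_GWH = 50
HOUSEHOLD_KWH_PER_YEAR = 10_500

CHATGPT_QUERY_WH = 2.9
GOOGLE_SEARCH_WH = 0.3

household_years = GPT4_TRAINING_GWH * 1_000_000 / HOUSEHOLD_KWH_PER_YEAR
query_ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH

print(f"Household-years of electricity: {household_years:,.0f}")  # ~4,762
print(f"ChatGPT query vs. Google search: {query_ratio:.1f}x")     # ~9.7x
```

The result, about 4,760 household-years for one training run and a roughly 9.7x per-query gap, lines up with the figures quoted above.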

Below is an overview of key players’ energy footprints based on available data: