As artificial intelligence becomes a routine fixture in daily life, its invisible energy toll is rapidly coming into focus. A sweeping new investigation by MIT Technology Review, part of its “Power Hungry” series, pulls back the curtain on the massive and growing electricity demands of AI, exposing how each prompt, image, or video generation chips away at finite energy resources.
Drawing on data from Lawrence Berkeley National Laboratory, interviews with two dozen experts, and proprietary measurements of open-source models, the report maps a trajectory in which AI could soon consume more electricity than entire nations.
With major players like Meta, OpenAI, and Microsoft pouring billions into data center expansion, often powered by fossil fuels, the consequences for the planet are becoming impossible to ignore.
Why It Matters: The AI revolution is transforming the digital experience and redrawing our energy and climate future. The growth of AI’s carbon footprint, fueled by opaque corporate practices and insufficient oversight, risks turning a tool of progress into a driver of environmental degradation. Understanding the scale and trajectory of AI’s power consumption is critical for shaping responsible tech policy, infrastructure planning, and consumer transparency.
- Inference, Not Training, Is the Main Power Drain: While training an AI model like GPT-4 is highly energy-intensive, consuming up to 50 gigawatt-hours and costing over $100 million, that cost is paid roughly once per model. Inference, the everyday process of users prompting the AI for responses, is ongoing and already accounts for 80–90% of AI computing power. As AI use proliferates across applications, inference is poised to dominate future energy demand. This shift highlights the need to optimize everyday interactions, not just training methods.
- Every Query Adds Up: The energy required to answer a single AI prompt varies dramatically based on model size and complexity. A simple text request might use as little energy as running a microwave for a fraction of a second, while high-end video generation can consume enough electricity to power that same microwave for over an hour. These seemingly small amounts scale quickly, especially as models become more personalized and agentic. With over a billion AI queries daily, the cumulative toll is enormous.
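The scaling arithmetic behind that bullet can be made concrete with a rough back-of-envelope estimate. The per-query energy figures and the share of video queries below are illustrative assumptions, not measurements from the report; only the "over a billion queries daily" figure comes from the article.

```python
# Back-of-envelope estimate of daily AI inference energy.
# Per-query energies and the video share are assumed placeholders,
# not figures from the MIT Technology Review report.

WH_PER_TEXT_QUERY = 0.3       # assumed: one short text response, in watt-hours
WH_PER_VIDEO_QUERY = 1000.0   # assumed: ~1 kWh for a high-end video generation

QUERIES_PER_DAY = 1_000_000_000  # "over a billion AI queries daily"
VIDEO_SHARE = 0.001              # assumed: 0.1% of queries generate video

text_wh = QUERIES_PER_DAY * (1 - VIDEO_SHARE) * WH_PER_TEXT_QUERY
video_wh = QUERIES_PER_DAY * VIDEO_SHARE * WH_PER_VIDEO_QUERY
total_gwh = (text_wh + video_wh) / 1e9  # watt-hours -> gigawatt-hours

print(f"Estimated daily inference energy: {total_gwh:.2f} GWh")
```

Even under these modest assumptions, a tiny fraction of heavy video queries ends up dominating the total, which is the crux of the bullet's warning about agentic and multimodal use.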
- Data Centers Are Energy Black Holes: AI workloads are transforming data centers into some of the largest electricity consumers in the U.S., already drawing 4.4% of the national power supply. By 2028, data centers could demand up to 12% of U.S. electricity, with AI-specific uses consuming between 165 and 326 terawatt-hours annually. That’s more power than some entire countries use in a year. These facilities, often hidden from public view, are reshaping the national energy landscape.
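The percentages and terawatt-hour figures above can be sanity-checked against a round estimate of total U.S. consumption. The ~4,000 TWh/year total is an assumed approximation; the 4.4% share and the 165–326 TWh AI-specific range come from the article.

```python
# Sanity-check the reported data-center shares of U.S. electricity.
# Total U.S. consumption is an assumed round figure (~4,000 TWh/year);
# the 4.4% share and the 165-326 TWh AI range are from the article.

US_TWH_PER_YEAR = 4000              # assumed approximate U.S. annual consumption
DC_SHARE_TODAY = 0.044              # data centers' current share (from article)
AI_TWH_LOW, AI_TWH_HIGH = 165, 326  # projected AI-specific range (from article)

dc_twh_today = US_TWH_PER_YEAR * DC_SHARE_TODAY
ai_share_low = AI_TWH_LOW / US_TWH_PER_YEAR
ai_share_high = AI_TWH_HIGH / US_TWH_PER_YEAR

print(f"Data centers today: ~{dc_twh_today:.0f} TWh/year")
print(f"AI-specific by 2028: {ai_share_low:.1%} to {ai_share_high:.1%} of U.S. supply")
```

Under these assumptions, AI-specific workloads alone would account for roughly 4–8% of U.S. supply, consistent with the article's projection that all data centers together could reach 12%.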
- Fossil Fuels Still Dominate AI Energy Supply: Despite public commitments to sustainability, most AI data centers still draw power from fossil-fuel-heavy grids. In regions like Virginia, where many centers are located, over half of the electricity comes from natural gas. Clean alternatives like nuclear and solar are part of long-term plans but currently represent a minority share. As a result, the carbon intensity of AI operations can be nearly 50% higher than the national average.
- Consumers Could Bear the Cost: Many utility contracts with tech giants offer discounted rates for massive data center operations, shifting the financial burden onto ordinary customers. In Virginia, for example, a recent legislative report projected an average $37.50 monthly increase in residential electricity bills due to data center growth. These hidden costs often go unnoticed amid the hype around AI. Without greater transparency and regulation, consumers may end up subsidizing the AI boom.