Editorial

The Energy Hunger of AI: Large Language Models as Challenges and Enablers for Sustainable Energy

1 CoE “National Center of Mechatronics and Clean Technologies”, 1000 Sofia, Bulgaria
2 Department of Computer Systems, Faculty of Computer Systems and Technologies, Technical University of Sofia, 1000 Sofia, Bulgaria
Energies 2025, 18(17), 4701; https://doi.org/10.3390/en18174701
Submission received: 27 August 2025 / Accepted: 3 September 2025 / Published: 4 September 2025

1. Introduction

Artificial Intelligence (AI) is becoming one of the defining technological drivers of the 21st century, with transformative applications across science, industry, and society. In the energy sector, AI already demonstrates significant potential for optimizing demand forecasting, supporting renewable integration, enabling predictive maintenance, and strengthening system resilience. Yet, as AI advances, it also introduces a paradox: the very models that promise efficiency and sustainability are themselves highly energy-intensive [1,2].
The emergence of Large Language Models (LLMs) has epitomized this paradox. LLMs such as GPT, BERT, and LLaMA represent the forefront of natural language processing, enabling human-like interactions and decision-support capabilities. However, the computational burden of training and deploying these models has escalated to unprecedented levels. Training a single state-of-the-art LLM can consume hundreds of megawatt-hours of electricity, while inference—the process of serving billions of queries—creates a continuous and growing global energy demand. This “energy hunger” raises questions about the environmental footprint of AI, its carbon emissions, and the sustainability of its supporting digital infrastructures [3,4].
At the same time, AI and LLMs promise to contribute solutions to the very energy challenges they exacerbate. By processing vast datasets, LLMs can assist in the rapid analysis of technical standards, grid operation logs, and energy market regulations. Combined with other AI paradigms, they can enhance situational awareness for operators, improve energy trading strategies, and support policy-making with data-driven insights. Thus, the intersection of AI and energy is both a source of concern and a landscape of opportunity [5,6].
This Editorial argues that the debate must move beyond viewing AI solely as an energy consumer. Instead, a holistic perspective is required—one that addresses both the supply side (how to provide sustainable, secure energy for AI infrastructures) and the demand side (how to design greener algorithms and efficient AI architectures). Moreover, leveraging AI itself, including LLM-driven systems, offers a path toward balancing the dual role of AI as both an energy challenge and a sustainability enabler [7,8].
In what follows, we examine the rapidly growing energy demand of AI with an emphasis on LLMs, outline pathways for sustainable and secure electricity supply for AI infrastructures, and show where AI, and LLMs in particular, can measurably improve efficiency and resilience. We close with a discussion of the implications for energy security and governance.
By highlighting both the risks and opportunities, this article seeks to stimulate a constructive dialog on how to reconcile AI’s energy footprint with global decarbonization and sustainability objectives.

2. Energy Demand from AI

The rapid rise of Artificial Intelligence—especially LLMs—has sharply increased computational and energy requirements. State-of-the-art LLMs involve billions of parameters and trillions of tokens; training alone can consume hundreds of megawatt-hours (e.g., GPT-3 ≈ 1287 MWh), and repeated fine-tuning compounds the footprint [9]. Yet, over a model’s lifetime, inference typically dominates energy use: once deployed, LLM services handle billions of queries under latency and availability constraints, turning energy demand into a continuous load [10,11]. The realized footprint depends not only on model size and serving efficiency (batching, caching, quantization) but also on data center performance and siting—commonly tracked via Power Usage Effectiveness (PUE) and Water Usage Effectiveness (WUE), alongside growing reporting on water use and heat reuse practices [12,13,14].
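To make these orders of magnitude concrete, a back-of-envelope sketch can compare one-off training energy with the continuous inference load. Apart from the widely cited ~1287 MWh GPT-3 training estimate [9], the per-query energy, query volume, and PUE below are illustrative assumptions, not measured values:

```python
# Back-of-envelope lifecycle energy for an LLM service.
# All inputs except TRAINING_MWH are illustrative assumptions.

TRAINING_MWH = 1287.0        # one-off training energy (GPT-3 estimate [9])
ENERGY_PER_QUERY_WH = 0.3    # assumed IT energy per served query
QUERIES_PER_DAY = 1e9        # assumed global query volume
PUE = 1.2                    # facility overhead multiplier

def inference_mwh_per_day(queries, wh_per_query, pue):
    """Facility-level inference energy per day, in MWh."""
    return queries * wh_per_query * pue / 1e6  # Wh -> MWh

daily = inference_mwh_per_day(QUERIES_PER_DAY, ENERGY_PER_QUERY_WH, PUE)
print(f"Inference: {daily:.0f} MWh/day; "
      f"equals training energy after {TRAINING_MWH / daily:.1f} days")
```

Under these assumed figures, cumulative inference overtakes the entire training budget within days, which is why inference typically dominates lifetime energy use.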
At system scale, AI is emerging among the fastest-growing digital loads. While estimates vary, data centers already consume ~1–1.5% of global electricity, and AI workloads account for a rising share; daily use of popular LLM services corresponds to multi-GW data center capacity in aggregate [10,11]. Comparisons with other technologies (e.g., crypto-mining) are method-dependent and time-sensitive; therefore, this Editorial focuses on the operational drivers that matter for planning and policy: model and workload characteristics, serving efficiency, PUE/WUE, and geographically aware siting.
These trends have three implications. First, without low-carbon supply, additional AI load increases lifecycle emissions. Second, temporal clustering of training and diurnal inference can stress local grids unless coordinated with flexibility and storage. Third, rapid AI build-outs may compete for scarce renewable capacity, heightening the need for additionality, water/heat reuse, and transparent reporting of energy and water metrics [13,15].

3. Energy Supply for AI

The exponential growth of AI workloads, especially from LLMs, is shifting data center electricity demand from a marginal to a system-relevant load. Meeting this demand sustainably means not just adding more generation, but aligning how, when, and where power is procured and used with climate and security goals. In practice, this centers on three pillars: clean supply, flexibility, and resilience [10,11,16,17].
On the supply side, high-additionality renewables (co-located or contracted) should anchor AI power portfolios, complemented by 24/7 temporal matching where feasible (clean MWh in the same hours the load occurs). Regarding siting and procurement, behind-the-meter PPAs (long-term contracts for on-site RES/storage located behind the facility’s electricity meter), on-site PV/wind, and campus microgrids reduce dependence on congested transmission and improve reliability. Good practice includes publishing PUE/WUE and water/heat reuse metrics, choosing cooling options consistent with ASHRAE envelopes (recommended and permissible temperature and humidity ranges for IT equipment in data centers), and prioritizing locations with low-carbon grids and sufficient water resources [10,17,18].
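The hourly 24/7 matching idea can be expressed as a simple carbon-free energy (CFE) score: the fraction of load, hour by hour, covered by contracted clean generation. The load and solar profiles below are invented for illustration:

```python
# Hourly CFE matching score for a data-center load (illustrative sketch).

def cfe_score(load_mwh, clean_mwh):
    """Hourly-matched CFE share: sum over hours of min(load, clean) / total load."""
    assert len(load_mwh) == len(clean_mwh)
    matched = sum(min(l, c) for l, c in zip(load_mwh, clean_mwh))
    return matched / sum(load_mwh)

# A flat 100 MWh/h data-center load vs. a solar-heavy clean portfolio (24 h).
load = [100.0] * 24
solar = [0] * 6 + [40, 90, 130, 150, 160, 160, 150, 130, 90, 40] + [0] * 8
print(f"Hourly CFE score: {cfe_score(load, solar):.0%}")
```

Note how the score penalizes midday solar surplus that cannot cover the night hours, which is exactly the gap storage and firm low-carbon supply must fill.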
Because renewables are variable, flexibility bridges the gap. Battery energy storage (BESS) handles sub-hourly balancing; long-duration storage (pumped hydro, thermal, hydrogen) covers multi-hour/diurnal needs. Crucially, AI offers a uniquely shiftable workload: most training jobs can be scheduled or slowed to follow periods of renewable surplus, while inference—constrained by latency SLAs—benefits from efficient serving (batching, caching, quantization) and modest buffering. Together, storage + workload shaping reduce residual fossil ramping and curtailment [10,11,16,18].
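The “shiftable training workload” idea admits a minimal sketch: greedily place a deferrable job’s energy into the hours with the largest renewable surplus, subject to a per-hour cluster power cap. All numbers are hypothetical:

```python
# Greedy workload-shaping sketch: schedule a deferrable training job
# into the hours with the most renewable surplus. Illustrative only.

def schedule_training(surplus_mwh, job_mwh, cap_mwh):
    """Allocate job energy to hours ranked by renewable surplus.
    Returns (per-hour plan, unserved energy)."""
    plan = [0.0] * len(surplus_mwh)
    remaining = job_mwh
    for hour in sorted(range(len(surplus_mwh)),
                       key=lambda h: surplus_mwh[h], reverse=True):
        take = min(cap_mwh, remaining)
        plan[hour] = take
        remaining -= take
        if remaining <= 0:
            break
    return plan, remaining

surplus = [0, 0, 5, 20, 35, 30, 10, 0]   # assumed renewable surplus per hour
plan, unmet = schedule_training(surplus, job_mwh=60, cap_mwh=25)
print("hourly plan:", plan, "| unserved MWh:", unmet)
```

Real schedulers additionally respect checkpointing overheads and job deadlines, but the principle of following the surplus is the same.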
For always-on capacity, dispatchable low-carbon sources (e.g., nuclear options, including SMRs, as well as geothermal and sustainable hydro/biomass where appropriate) provide stability for large AI clusters and interconnection-limited regions. In practice, a hybrid clean mix—RES + storage + a firm low-carbon tranche—minimizes cost and risk across weather and market conditions [10,11,16].
Finally, grid integration and security are decisive. Rapid interconnection, voltage support, and participation in demand response turn data centers from passive loads into grid assets. Heat from AI clusters can feed district heating, and water stewardship policies mitigate local impacts (tracked via WUE). Because AI/data center sites are cyber-physical nodes, operators should adopt baseline controls aligned with North American Electric Reliability Corporation Critical Infrastructure Protection (NERC CIP—where applicable), the updated European cybersecurity directive (NIS2), and the ISO/IEC 27001 standard, with secure-by-design networking, access control, and incident response [17,18,19,20,21].
In sum, a credible pathway to sustainable AI supply involves diversification, localization, and decisive decarbonization, encompassing high-additionality renewables with temporal matching; storage and workload flexibility to follow the sun and wind; and a firm low-carbon layer plus modernized, secure grid interfaces. These factors could turn AI clusters from net stressors into partners of the clean-energy transition [10,11,18,19,20,21].

4. AI for Energy Optimization

While AI’s rising electricity demand—especially in relation to LLMs—is a concern, these technologies also enable practical gains across the power system. Modern deep-learning methods improve short-term load and renewable forecasts, lowering reserve margins and curtailment. Reinforcement and hybrid controllers coordinate storage and flexible demand more effectively, and data-driven dispatch reduces operational frictions in complex conditions [22]. Crucially, LLMs act as operator copilots: with retrieval-augmented generation, they bridge structured telemetry and unstructured sources (procedures, manuals, incident logs), producing concise, context-aware guidance and checklists for control rooms. This makes heterogeneous information legible under time pressure and supports faster, better-justified decisions [23].
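The retrieval step of such an operator copilot can be sketched minimally. The snippet below substitutes naive keyword overlap for a real embedding-based retriever, and the procedure titles are invented:

```python
# Toy sketch of the retrieval step in a RAG-style operator copilot:
# rank procedure snippets by word overlap with the operator's query,
# then hand the top hits to the LLM as context. Documents are made up.

def retrieve(query, docs, k=2):
    """Return the k docs sharing the most words with the query."""
    q = set(query.lower().split())
    return sorted(docs,
                  key=lambda d: len(q & set(d.lower().split())),
                  reverse=True)[:k]

docs = [
    "transformer overtemperature alarm response procedure",
    "weekly market settlement report template",
    "feeder islanding and restoration checklist",
]
hits = retrieve("overtemperature alarm on transformer T2", docs)
print("top hit:", hits[0])
```

Production systems replace the overlap score with dense vector similarity and add source citations, but the pipeline shape—retrieve, then generate grounded guidance—is the one described above [23].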
On the asset side, AI-based predictive maintenance detects early fault signatures in turbines, transformers, and batteries, improving availability and cutting O&M cost per MW. In market operations, AI sharpens price forecasts and bidding strategies, while LLMs summarize regulatory changes and news and explain model outputs for human-in-the-loop approval—linking quantitative models to the textual information streams that shape market behavior [10,22,23]. More broadly, human–AI collaboration matters: LLMs provide multilingual, domain-adapted interfaces so operators, engineers, and planners can query systems in natural language and receive traceable rationales tied to underlying data.
Applied responsibly, AI for energy should not exacerbate the energy problem it seeks to solve. Practical steps include deploying green-by-default practices; right-sizing models to tasks; preferring compression/distillation; using parameter-efficient fine-tuning; adopting low-bit quantization and, where appropriate, sparse/MoE routing (conditional computation in which only a few expert sub-networks are activated per token, reducing floating-point operations (FLOPs) and energy per token while retaining high model capacity); and running efficient serving backed by rigorous Machine Learning Operations (MLOps) and continuous evaluation. Equally important is reporting operational value through clear Key Performance Indicators (KPIs), such as reduced forecast error and avoided curtailment and downtime, alongside energy/water metrics for the AI service itself (e.g., energy per 1 k tokens and site-level PUE/WUE), keeping deployments accountable for both performance and footprint [24,25,26,27,28].
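One of the proposed KPIs, facility-level energy per 1 k tokens, is straightforward to compute once IT energy, token counts, and PUE are metered. The inputs below are assumed example values, not measurements:

```python
# KPI sketch: facility-level energy per 1,000 generated tokens.
# Inputs are assumed example values for illustration.

def energy_per_1k_tokens(it_kwh, tokens, pue):
    """kWh per 1k tokens at the facility boundary (IT energy x PUE)."""
    return it_kwh * pue / (tokens / 1_000)

# Assumed: a serving node drew 12 kWh of IT energy to emit 30M tokens,
# at a site with PUE 1.15.
kpi = energy_per_1k_tokens(it_kwh=12.0, tokens=30_000_000, pue=1.15)
print(f"{kpi * 1000:.2f} Wh per 1k tokens")
```

Tracking this figure over time, alongside accuracy metrics, makes efficiency regressions visible in the same dashboards as model quality.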

5. AI, Energy Efficiency and Security

Beyond optimizing day-to-day operations, AI can lift both the efficiency and the security of energy systems—provided its new risks are managed deliberately. On the model side, a green-by-default approach is essential: compression/pruning and knowledge distillation shrink computation; low-bit quantization and sparse/Mixture-of-Experts routing cut joules per token; parameter-efficient fine-tuning limits retraining; and federated/edge learning reduces reliance on centralized, energy-intensive data centers. Operationally, right-sizing models to tasks, efficient serving, and accelerator-aware scheduling keep the footprint in check. These practices lower cost and emissions while preserving utility and should be reported with energy/water metrics (e.g., energy per 1 k tokens, PUE/WUE) alongside accuracy to align AI with climate targets [24,25,26,27,28].
System-wide, AI improves the efficiency of networks by sharpening unit commitment and dispatch, optimizing storage utilization, and reducing losses. LLMs act as a knowledge bridge, turning heterogeneous inputs—structured telemetry plus unstructured manuals, procedures, and incident logs—into operator-ready guidance and explanations, which shortens decision cycles and supports human-in-the-loop control rooms [22,23].
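As a toy illustration of the dispatch problem such methods refine, a merit-order sketch meets demand with the cheapest units first; AI-based approaches add forecasts, network constraints, and uncertainty on top of this core logic. The unit data are hypothetical:

```python
# Minimal merit-order dispatch sketch: meet demand with the cheapest
# units first. Unit names, capacities, and costs are hypothetical.

def dispatch(units, demand_mw):
    """units: list of (name, capacity_mw, cost_per_mwh).
    Returns (dispatch plan, unserved demand)."""
    plan = {}
    remaining = demand_mw
    for name, cap, cost in sorted(units, key=lambda u: u[2]):
        used = min(cap, remaining)
        if used > 0:
            plan[name] = used
        remaining -= used
        if remaining <= 0:
            break
    return plan, remaining

units = [("wind", 300, 0), ("ccgt", 400, 60), ("peaker", 150, 120)]
plan, unserved = dispatch(units, demand_mw=550)
print(plan, "| unserved MW:", unserved)
```

Zero-marginal-cost renewables are dispatched first, which is why sharper forecasts of their output translate directly into lower reserve and fuel costs.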
Security is inherently dual-use. Defensively, AI strengthens cyber-situational awareness via anomaly detection, log triage, and faster incident response. Offensively, the same techniques (and LLMs) can be misused to craft more sophisticated attacks or disinformation. Mitigations require secure-by-design architectures, strong access control and segmentation, patching/monitoring, supply chain assurance (SBOMs), and mandatory incident reporting where applicable—aligned with established frameworks such as NERC CIP (North America), NIS2 (EU), and ISO/IEC 27001 (global) [18,19,20].
Finally, AI contributes to resilience during blackouts and amidst renewable variability and cyber incidents. Data-driven models and LLM-based assistants offer rapid, context-aware recommendations that integrate technical, operational, and regulatory perspectives—supporting restoration, islanding strategies, and post-event learning. The net result is a portfolio that is not only smarter and more efficient but also measurably safer and faster to recover when stressed [29,30,31].

6. Conclusions and Outlook

The accelerating adoption of AI presents a paradox for the global energy transition. On one hand, AI—particularly AI based on LLMs—is among the most energy-intensive digital technologies ever developed, driving exponential growth in electricity demand and raising serious concerns about sustainability, carbon emissions, and infrastructure stress. On the other hand, AI holds the promise of becoming a key enabler of the very sustainability goals it threatens by optimizing demand forecasting, integrating renewables, strengthening grid resilience, and improving decision-making across the energy value chain.
This duality underscores the need for a holistic framework that addresses both sides of the energy–AI nexus:
- Demand side: Development of greener, more efficient AI models and architectures;
- Supply side: Decarbonized and secure energy provision for AI infrastructures, combining renewables, storage, and nuclear options;
- System side: Application of AI, including LLMs, to enhance energy efficiency, security, and resilience.
The path forward requires a coordinated effort across research, industry, and policy. Researchers must prioritize algorithmic and hardware innovations that reduce the energy footprint of AI. Energy providers must align AI-driven demand growth with clean and resilient energy systems. Policymakers must ensure that governance structures incentivize both technological progress and environmental responsibility.
Ultimately, whether AI becomes a driver of sustainable transformation or a burden on global energy systems will depend on our ability to reconcile its energy hunger with the imperatives of decarbonization and resilience. By approaching AI not only as a consumer but also as a contributor to energy sustainability, society can harness its potential to accelerate, rather than hinder, the clean energy transition.

Funding

This work was supported by the Operational Programme “Research, Innovation and Digitalisation for Smart Transformation 2021–2027” under Project No. BG16RFPR002-1.014-0006-C01 “National center of excellence for mechatronics and clean technologies”.

Conflicts of Interest

The author declares no conflicts of interest.

References

1. Franki, V.; Majnarić, D.; Višković, A. A Comprehensive Review of Artificial Intelligence (AI) Companies in the Power Sector. Energies 2023, 16, 1077.
2. Chen, M.; Cui, H.; Blaabjerg, F.; Lorenz, L.; Hellinger, R.; Gray, T.; Fink, O.; Hermanns, K. Power for AI and AI for Power: The Infinite Entanglement Between Artificial Intelligence and Power Electronics Systems. IEEE Power Electron. Mag. 2025, 12, 37–43.
3. Han, S.; Wang, M.; Zhang, J.; Li, D.; Duan, J. A Review of Large Language Models: Fundamental Architectures, Key Technological Evolutions, Interdisciplinary Technologies Integration, Optimization and Compression Techniques, Applications, and Challenges. Electronics 2024, 13, 5040.
4. Raiaan, M.A.K.; Mukta, S.H.; Fatema, K.; Fahad, N.M.; Sakib, S.; Mim, M.M.J.; Ahmad, J.; Ali, M.E.; Azam, S. A Review on Large Language Models: Architectures, Applications, Taxonomies, Open Issues and Challenges. IEEE Access 2024, 12, 26839–26874.
5. Li, J.; Herdem, M.S.; Nathwani, J.; Wen, J.Z. Methods and applications for Artificial Intelligence, Big Data, Internet of Things, and Blockchain in smart energy management. Energy AI 2023, 11, 100208.
6. Kankanhalli, A. Peer Review in the Age of Generative AI. J. Assoc. Inf. Syst. 2024, 25, 76–84.
7. Fang, X.; Che, S.; Mao, M.; Zhang, H.; Zhao, M.; Zhao, X. Bias of AI-generated content: An examination of news produced by large language models. Sci. Rep. 2024, 14, 5224.
8. Rajendran, B. Building a Smart and Green AI. Electrochem. Soc. Interface 2023, 32, 47–48.
9. Patterson, D.; Gonzalez, J.; Le, Q.V.; Liang, C.; Munguia, L.-M.; Rothchild, D.; So, D.; Texier, M.; Dean, J. Carbon Emissions and Large Neural Network Training. arXiv 2021, arXiv:2104.10350.
10. International Energy Agency (IEA). Data Centres and Data Transmission Networks; IEA: Paris, France, 2023. Available online: https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks (accessed on 22 August 2025).
11. International Energy Agency (IEA). AI Is Set to Drive Surging Electricity Demand from Data Centres While Offering the Potential to Transform How the Energy Sector Works; News Release, 10 April 2025. Available online: https://www.iea.org/news/ai-is-set-to-drive-surging-electricity-demand-from-data-centres-while-offering-the-potential-to-transform-how-the-energy-sector-works (accessed on 22 August 2025).
12. The Green Grid. PUE™: A Comprehensive Examination of the Metric; The Green Grid: Washington, DC, USA, 2014. Available online: https://www.thegreengrid.org/en/resources/library-and-tools/20-PUE%3A-A-Comprehensive-Examination-of-the-Metric (accessed on 22 August 2025).
13. The Green Grid. WP#35—Water Usage Effectiveness (WUE): A Green Grid Data Center Sustainability Metric; The Green Grid: Washington, DC, USA, 2011. Available online: https://www.thegreengrid.org/en/resources/library-and-tools/238-WP%2335---Water-Usage-Effectiveness-%28WUE%29%3A-A-Green-Grid-Data-Center-Sustainability-Metric (accessed on 22 August 2025).
14. Li, P.; Yang, J.; Islam, M.A.; Ren, S. Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models. arXiv 2023, arXiv:2304.03271.
15. Hu, E.J.; Shen, Y.; Wallis, P.; Allen-Zhu, Z.; Li, Y.; Wang, L.; Wang, W.; Chen, W. LoRA: Low-Rank Adaptation of Large Language Models. arXiv 2021, arXiv:2106.09685.
16. U.S. Department of Energy (DOE); National Renewable Energy Laboratory (NREL). Best Practices Guide for Energy-Efficient Data Center Design; DOE: Washington, DC, USA, 2024. Available online: https://www.energy.gov/sites/default/files/2024-07/best-practice-guide-data-center-design_0.pdf (accessed on 22 August 2025).
17. ASHRAE TC 9.9. Thermal Guidelines for Data Processing Environments, 5th ed.; ASHRAE: Atlanta, GA, USA, 2021. Available online: https://www.ashrae.org/file%20library/technical%20resources/bookstore/supplemental%20files/therm-gdlns-5th-r-e-refcard.pdf (accessed on 22 August 2025).
18. European Commission, Joint Research Centre (JRC). 2025 Best Practice Guidelines for the EU Code of Conduct on Data Centre Energy Efficiency; Publications Office of the EU: Luxembourg, 2025. Available online: https://publications.jrc.ec.europa.eu/repository/handle/JRC141521 (accessed on 22 August 2025).
19. CIP-002-5.1a; Cyber Security: BES Cyber System Categorization. North American Electric Reliability Corporation (NERC): Atlanta, GA, USA, 2016. Available online: https://www.nerc.com/pa/stand/reliability%20standards/cip-002-5.1a.pdf (accessed on 22 August 2025).
20. European Union. Directive (EU) 2022/2555 (NIS 2 Directive) on Measures for a High Common Level of Cybersecurity across the Union; Official Journal of the European Union L 333, 27 December 2022; pp. 80–152. Available online: https://eur-lex.europa.eu/eli/dir/2022/2555/oj/eng (accessed on 22 August 2025).
21. ISO/IEC 27001:2022; Information Security Management Systems—Requirements. International Organization for Standardization: Geneva, Switzerland, 2022. Available online: https://www.iso.org/standard/27001 (accessed on 22 August 2025).
22. Hong, T.; Fan, S. Probabilistic electric load forecasting: A tutorial review. Int. J. Forecast. 2016, 32, 914–938.
23. Lewis, P.; Perez, E.; Piktus, A.; Petroni, F.; Karpukhin, V.; Goyal, N.; Küttler, H.; Lewis, M.; Yih, W.-T.; Rocktäschel, T.; et al. Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks. arXiv 2020, arXiv:2005.11401.
24. Schwartz, R.; Dodge, J.; Smith, N.A.; Etzioni, O. Green AI. Commun. ACM 2020, 63, 54–63.
25. Hinton, G.; Vinyals, O.; Dean, J. Distilling the Knowledge in a Neural Network. arXiv 2015, arXiv:1503.02531.
26. Dettmers, T.; Lewis, M.; Belkada, Y.; Zettlemoyer, L. LLM.int8(): 8-bit Matrix Multiplication for Transformers at Scale. arXiv 2022, arXiv:2208.07339.
27. Frantar, E.; Ashkboos, S.; Alistarh, D.; Hoefler, T. GPTQ: Accurate Post-Training Quantization for Generative Pretrained Transformers. arXiv 2022, arXiv:2210.17323.
28. Fedus, W.; Zoph, B.; Shazeer, N. Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity. arXiv 2021, arXiv:2101.03961.
29. Verdecchia, R.; Sallou, J.; Cruz, L. A systematic review of Green AI. WIREs Data Min. Knowl. Discov. 2023, 13, e1507.
30. Bolón-Canedo, V.; Morán-Fernández, L.; Cancela, B.; Alonso-Betanzos, A. A review of green artificial intelligence: Towards a more sustainable future. Neurocomputing 2024, 599, 128096.
31. Tang, Y.; Cao, D.; Xiao, J.; Jiang, C.; Huang, Q.; Li, Y.; Chen, Z.; Blaabjerg, F.; Hu, W. AI-aided power electronic converters automatic online real-time efficiency optimization method. Fundam. Res. 2023, 5, 1111–1116.

Share and Cite

Hinov, N. The Energy Hunger of AI: Large Language Models as Challenges and Enablers for Sustainable Energy. Energies 2025, 18, 4701. https://doi.org/10.3390/en18174701

