From Red AI to Green AI: A Unified Survey of Lifecycle Costs, Efficiency Techniques, and a Comprehensive Reporting Framework
Pinaki Bose, Independent Researcher, USA

Abstract
The exponential growth of large-scale Artificial Intelligence (AI) models, or "Red AI", has led to a 300,000-fold increase in computational demand since 2012, raising significant environmental and sustainability concerns. While the high carbon cost of model training (e.g., GPT-3's estimated 550 metric tons of CO2e) is well-documented, this focus obscures the dominant environmental burden: model inference, which can account for up to 90% of a model's total lifecycle energy consumption. A critical research gap exists in the unified analysis of carbon cost versus performance metrics across this entire AI lifecycle. Furthermore, the field lacks a standardized, comprehensive framework for Green AI reporting, hampering transparent and verifiable comparisons. This paper addresses this gap through a systematic review and quantitative synthesis of Green AI. We systematically categorize and evaluate three pillars of technical optimization: (1) model compression, (2) hardware-aware AI, and (3) low-power inference techniques. This analysis reveals that high-level architectural choices, such as using general-purpose generative models for discriminative tasks, are orders of magnitude (e.g., 14.6x to 30x) less efficient than task-specific models. We also highlight a "measurement crisis," in which common reporting tools such as CodeCarbon underestimate true energy consumption by 20-40% compared to ground-truth measurements. We conclude by proposing a comprehensive, lifecycle-based Green AI reporting framework designed to integrate with existing GHG and ISO standards. This framework mandates unified cost-performance metrics (e.g., CO2e/inference/performance-unit) to enable transparent, verifiable, and informed decision-making for sustainable AI development.
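The unified cost-performance metric named in the abstract (CO2e per inference per performance unit) can be sketched as a simple calculation. The function below is an illustrative assumption, not the paper's specified formula: the function name, parameters, and the example figures are hypothetical, and the `meter_correction` factor reflects the abstract's finding that software meters can read 20-40% low against ground truth.

```python
def co2e_per_inference_per_unit(energy_kwh, grid_intensity_kg_per_kwh,
                                n_inferences, performance_score,
                                meter_correction=1.0):
    """CO2e (kg) per inference, normalized by a task performance score.

    meter_correction: multiplier compensating for software-meter
    underestimation (e.g., 1.2-1.4 if a tool like CodeCarbon is known
    to read 20-40% low); 1.0 assumes ground-truth metering.
    """
    total_co2e_kg = energy_kwh * meter_correction * grid_intensity_kg_per_kwh
    return total_co2e_kg / n_inferences / performance_score

# Hypothetical example: 50 kWh metered for 1,000,000 inferences on a
# 0.4 kgCO2e/kWh grid, with F1 = 0.9 and a 30% meter correction.
metric = co2e_per_inference_per_unit(50.0, 0.4, 1_000_000, 0.9,
                                     meter_correction=1.3)
```

Normalizing by a performance score lets a compressed model that loses a little accuracy be compared fairly against a larger, more accurate one on a single axis.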
Keywords
Green AI, Sustainable AI, Energy-Efficient AI, Model Compression, Hardware-Aware AI, Carbon Footprint, AI Lifecycle Assessment, LLM
References
General Purpose AI Uses 20 to 30 Times More Energy than Task-Specific AI - Proof News, https://www.proofnews.org/general-purpose-ai-uses-20-to-30-times-more-energy-than-task-specific-ai/
How AI Uses Energy - Third Way, https://www.thirdway.org/memo/how-ai-uses-energy
How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference - arXiv, https://arxiv.org/pdf/2505.09598
Smarter sustainability: How technology can transform climate metrics and disclosure, https://ccli.ubc.ca/smarter-sustainability-how-technology-can-transform-climate-metrics-and-disclosure/
Measuring AI's Energy/Environmental Footprint to Access Impacts, https://fas.org/publication/measuring-and-standardizing-ais-energy-footprint/
Lower Numerical Precision Deep Learning Inference and Training - Intel, https://www.intel.com/content/dam/develop/external/us/en/documents/lower-numerical-precision-deep-learning-jan2018-754765.pdf
PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation - arXiv, https://arxiv.org/abs/2106.14681
A Survey on Neural Network Hardware Accelerators - IEEE Computer Society, https://www.computer.org/csdl/journal/ai/2024/08/10472723/1ViYSMvUFI4
Inference, Low-Cost Models, and Compression - CS@Cornell, https://www.cs.cornell.edu/courses/cs6787/2018fa/Lecture11.pdf
Edge Intelligence: A Review of Deep Neural Network Inference in ..., https://www.mdpi.com/2079-9292/14/12/2495
Towards a Methodology and Framework for AI Sustainability Metrics - HotCarbon, https://hotcarbon.org/assets/2023/pdf/a13-eilam.pdf
The Hidden Cost of an Image: Quantifying the Energy Consumption of AI Image Generation - arXiv, https://arxiv.org/abs/2506.17016
Comparative analysis of model compression techniques for ..., https://pubmed.ncbi.nlm.nih.gov/40604122/
Energy-Efficient Transformer Inference: Optimization Strategies for Time Series Classification - arXiv, https://arxiv.org/abs/2502.16627
Reducing Carbon Footprint of Machine Learning Through Model ..., https://www.ijisrt.com/assets/upload/files/IJISRT25AUG970.pdf
mlco2/codecarbon: Track emissions from Compute and recommend ways to reduce their impact on the environment. - GitHub, https://github.com/mlco2/codecarbon
How to estimate and reduce the carbon footprint of machine learning models, https://towardsdatascience.com/how-to-estimate-and-reduce-the-carbon-footprint-of-machine-learning-models-49f24510880/
Ground-Truthing AI Energy Consumption: Validating CodeCarbon Against External Measurements - arXiv, https://arxiv.org/pdf/2509.22092
Life-Cycle Emissions of AI Hardware: A Cradle-To-Grave Approach and Generational Trends - arXiv, https://arxiv.org/html/2502.01671v1
Sustain AI: A Multi-Modal Deep Learning Framework for Carbon Footprint Reduction in Industrial Manufacturing - MDPI, https://www.mdpi.com/2071-1050/17/9/4134
Criteria for Credible AI-assisted Carbon Footprinting Systems: The Cases of Mapping and Lifecycle Modeling - arXiv, https://arxiv.org/html/2509.00240v1
Copyright License

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors retain the copyright of their manuscripts, and all Open Access articles are disseminated under the terms of the Creative Commons Attribution License 4.0 (CC-BY), which permits unrestricted use, distribution, and reproduction in any medium, provided that the original work is appropriately cited. The use of general descriptive names, trade names, trademarks, and so forth in this publication, even if not specifically identified, does not imply that these names are not protected by the relevant laws and regulations.


Engineering and Technology