Green Recommender Systems: Understanding and Minimizing the Carbon Footprint of AI-Powered Personalization
- URL: http://arxiv.org/abs/2509.13001v1
- Date: Tue, 16 Sep 2025 12:13:31 GMT
- Title: Green Recommender Systems: Understanding and Minimizing the Carbon Footprint of AI-Powered Personalization
- Authors: Lukas Wegmeth, Tobias Vente, Alan Said, Joeran Beel
- Abstract summary: We examine the environmental impact of recommender systems research by reproducing typical experimental pipelines.
Based on our results, we provide guidelines for researchers and practitioners on how to minimize the environmental footprint of their work.
- Score: 0.05249805590164902
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: As global warming soars, the need to assess and reduce the environmental impact of recommender systems is becoming increasingly urgent. Despite this, the recommender systems community hardly understands, addresses, and evaluates the environmental impact of their work. In this study, we examine the environmental impact of recommender systems research by reproducing typical experimental pipelines. Based on our results, we provide guidelines for researchers and practitioners on how to minimize the environmental footprint of their work and implement green recommender systems - recommender systems designed to minimize their energy consumption and carbon footprint. Our analysis covers 79 papers from the 2013 and 2023 ACM RecSys conferences, comparing traditional "good old-fashioned AI" models with modern deep learning models. We designed and reproduced representative experimental pipelines for both years, measuring energy consumption using a hardware energy meter and converting it into CO2 equivalents. Our results show that papers utilizing deep learning models emit approximately 42 times more CO2 equivalents than papers using traditional models. On average, a single deep learning-based paper generates 2,909 kilograms of CO2 equivalents - more than the carbon emissions of a person flying from New York City to Melbourne or the amount of CO2 sequestered by one tree over 260 years. This work underscores the urgent need for the recommender systems and wider machine learning communities to adopt green AI principles, balancing algorithmic advancements and environmental responsibility to build a sustainable future with AI-powered personalization.
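The headline numbers in the abstract rest on a simple conversion: metered energy consumption multiplied by a grid carbon-intensity factor yields CO2 equivalents. A minimal sketch of that conversion, assuming a placeholder intensity of 0.4 kg CO2eq per kWh (a rough global-average figure, not the factor the authors actually used):

```python
def energy_to_co2eq(energy_kwh: float, intensity_kg_per_kwh: float = 0.4) -> float:
    """Convert metered energy (kWh) into kg of CO2 equivalents.

    The default intensity of 0.4 kg CO2eq/kWh is a rough global-average
    placeholder; the paper's actual regional conversion factor may differ.
    """
    return energy_kwh * intensity_kg_per_kwh


# Back-of-the-envelope check: at the assumed intensity, the reported
# 2,909 kg CO2eq per deep-learning paper implies roughly 7,300 kWh
# of measured energy consumption.
implied_kwh = 2909 / 0.4
print(f"100 kWh -> {energy_to_co2eq(100):.1f} kg CO2eq")
print(f"2,909 kg CO2eq implies ~{implied_kwh:.0f} kWh at 0.4 kg/kWh")
```

Because the conversion is linear in the intensity factor, the same measured energy can imply very different emissions depending on the regional grid, which is why the choice of factor matters as much as the measurement itself.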
Related papers
- The Hidden AI Race: Tracking Environmental Costs of Innovation [2.5782420501870296]
We study the amount of carbon dioxide released by models across different domains over varying time periods.
Our findings reveal that model size and versioning frequency are strongly correlated with higher emissions.
University-driven projects exhibit the highest emissions, followed by non-profits and companies.
arXiv Detail & Related papers (2025-11-27T22:14:43Z)
- How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference [0.0]
This paper introduces a novel infrastructure-aware benchmarking framework for quantifying the environmental footprint of AI inference across 30 state-of-the-art models as deployed in commercial data centers.
Our results show that o3 and DeepSeek-R1 emerge as the most energy-intensive models, consuming over 33 Wh per long prompt, more than 70 times the consumption of GPT-4.1 nano, and that Claude-3.7 Sonnet ranks highest in eco-efficiency.
These findings illustrate a growing paradox: although AI is becoming cheaper and faster, its global adoption drives disproportionate resource consumption.
arXiv Detail & Related papers (2025-05-14T17:47:00Z)
- From Clicks to Carbon: The Environmental Toll of Recommender Systems [0.24374097382908472]
We estimate the environmental impact of recommender systems research by reproducing typical experimental pipelines.
Our analysis spans 79 full papers from the 2013 and 2023 ACM RecSys conferences.
On average, a single deep learning-based paper generates 3,297 kilograms of CO2 equivalents.
arXiv Detail & Related papers (2024-08-15T15:11:06Z)
- Generative AI for Low-Carbon Artificial Intelligence of Things with Large Language Models [67.0243099823109]
Generative AI (GAI) holds immense potential to reduce the carbon emissions of the Artificial Intelligence of Things (AIoT).
In this article, we explore the potential of GAI for carbon emissions reduction and propose a novel GAI-enabled solution for low-carbon AIoT.
We propose a Large Language Model (LLM)-enabled carbon emission optimization framework, in which we design pluggable LLM and Retrieval Augmented Generation (RAG) modules.
arXiv Detail & Related papers (2024-04-28T05:46:28Z)
- Green AI: Exploring Carbon Footprints, Mitigation Strategies, and Trade Offs in Large Language Model Training [9.182429523979598]
We evaluate the CO2 emissions of well-known large language models, which have an especially high carbon footprint due to their significant amount of model parameters.
We argue for the training of LLMs in a way that is responsible and sustainable by suggesting measures for reducing carbon emissions.
arXiv Detail & Related papers (2024-04-01T15:01:45Z)
- A Comparative Study of Machine Learning Algorithms for Anomaly Detection in Industrial Environments: Performance and Environmental Impact [62.997667081978825]
This study seeks to address the demands of high-performance machine learning models with environmental sustainability.
Traditional machine learning algorithms, such as Decision Trees and Random Forests, demonstrate robust efficiency and performance.
However, superior outcomes were obtained with optimised configurations, albeit with a commensurate increase in resource consumption.
arXiv Detail & Related papers (2023-07-01T15:18:00Z)
- Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy comes with an environmental cost in terms of greenhouse gas emissions, depending on the quantity used and the energy source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z)
- Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model [72.65502770895417]
We quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle.
We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption.
We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of machine learning models.
arXiv Detail & Related papers (2022-11-03T17:13:48Z)
- Eco2AI: carbon emissions tracking of machine learning models as the first step towards sustainable AI [47.130004596434816]
In eco2AI, we put emphasis on the accuracy of energy consumption tracking and correct regional CO2 emissions accounting.
The motivation also comes from the concept of an AI-based greenhouse gas sequestration cycle with both Sustainable AI and Green AI pathways.
arXiv Detail & Related papers (2022-07-31T09:34:53Z)
- Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning [68.37641996188133]
We introduce a framework for tracking real-time energy consumption and carbon emissions.
We create a leaderboard for energy efficient reinforcement learning algorithms.
We propose strategies for mitigation of carbon emissions and reduction of energy consumption.
arXiv Detail & Related papers (2020-01-31T05:12:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.