From Clicks to Carbon: The Environmental Toll of Recommender Systems
- URL: http://arxiv.org/abs/2408.08203v2
- Date: Thu, 22 Aug 2024 10:14:33 GMT
- Title: From Clicks to Carbon: The Environmental Toll of Recommender Systems
- Authors: Tobias Vente, Lukas Wegmeth, Alan Said, Joeran Beel
- Abstract summary: We estimate the environmental impact of recommender systems research by reproducing typical experimental pipelines.
Our analysis spans 79 full papers from the 2013 and 2023 ACM RecSys conferences.
On average, a single deep learning-based paper generates 3,297 kilograms of CO2 equivalents.
- Score: 0.24374097382908472
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: As global warming soars, the need to assess the environmental impact of research is becoming increasingly urgent. Despite this, few recommender systems research papers address their environmental impact. In this study, we estimate the environmental impact of recommender systems research by reproducing typical experimental pipelines. Our analysis spans 79 full papers from the 2013 and 2023 ACM RecSys conferences, comparing traditional "good old-fashioned AI" algorithms with modern deep learning algorithms. We designed and reproduced representative experimental pipelines for both years, measuring energy consumption with a hardware energy meter and converting it to CO2 equivalents. Our results show that papers using deep learning algorithms emit approximately 42 times more CO2 equivalents than papers using traditional methods. On average, a single deep learning-based paper generates 3,297 kilograms of CO2 equivalents - more than the carbon emissions of one person flying from New York City to Melbourne or the amount of CO2 one tree sequesters over 300 years.
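The pipeline described in the abstract reduces to a simple conversion: energy measured by a hardware meter (kWh) multiplied by the carbon intensity of the local grid (kg CO2e/kWh). A minimal sketch of that arithmetic follows; the grid intensity value is an illustrative assumption, not a figure from the paper.

```python
def energy_to_co2e(energy_kwh: float, grid_intensity_kg_per_kwh: float = 0.4) -> float:
    """Convert measured energy consumption (kWh) into kg of CO2 equivalents.

    grid_intensity_kg_per_kwh is an assumed placeholder; real accounting
    uses the carbon intensity of the regional electricity mix.
    """
    return energy_kwh * grid_intensity_kg_per_kwh

# Example: an experiment drawing 100 kWh on an assumed 0.4 kg CO2e/kWh grid
print(energy_to_co2e(100))  # 40.0 kg CO2e
```

Under this model, the reported 3,297 kg CO2e per deep learning paper would correspond to roughly 8,000 kWh of measured consumption at the assumed intensity; the actual figure depends on the grid mix used in the study.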
Related papers
- Green Recommender Systems: Understanding and Minimizing the Carbon Footprint of AI-Powered Personalization [0.05249805590164902]
We examine the environmental impact of recommender systems research by reproducing typical experimental pipelines.
Based on our results, we provide guidelines for researchers and practitioners on how to minimize the environmental footprint of their work.
arXiv Detail & Related papers (2025-09-16T12:13:31Z) - How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference [0.0]
This paper introduces a novel infrastructure-aware benchmarking framework for quantifying the environmental footprint of AI inference across 30 state-of-the-art models as deployed in commercial data centers.
Our results show that o3 and DeepSeek-R1 emerge as the most energy-intensive models, consuming over 33 Wh per long prompt, more than 70 times the consumption of GPT-4.1 nano, and that Claude-3.7 Sonnet ranks highest in eco-efficiency.
These findings illustrate a growing paradox: although AI is becoming cheaper and faster, its global adoption drives disproportionate resource consumption.
arXiv Detail & Related papers (2025-05-14T17:47:00Z) - A Comprehensive Approach to Carbon Dioxide Emission Analysis in High Human Development Index Countries using Statistical and Machine Learning Techniques [4.106914713812204]
It is imperative to forecast CO2 emission trends and classify countries based on their emission patterns to effectively mitigate worldwide carbon emissions.
This paper presents an in-depth comparative study on the determinants of CO2 emission in twenty countries with high Human Development Index (HDI), exploring factors related to economy, environment, energy use, and renewable resources over a span of 25 years.
arXiv Detail & Related papers (2024-05-01T21:00:02Z) - Machine Guided Discovery of Novel Carbon Capture Solvents [48.7576911714538]
Machine learning offers a promising method for reducing the time and resource burdens of materials development.
We have developed an end-to-end "discovery cycle" to select new aqueous amines compatible with the commercially viable acid gas scrubbing carbon capture.
The prediction process shows 60% accuracy against experiment for both material parameters and 80% for a single parameter on an external test set.
arXiv Detail & Related papers (2023-03-24T18:32:38Z) - Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy comes with an environmental cost in terms of greenhouse gas emissions, depending on the quantity used and the energy source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z) - Carbon Emission Prediction on the World Bank Dataset for Canada [0.9256577986166795]
This paper presents methods for predicting carbon emissions (CO2 emissions) for the next few years.
The predictions are based on data from the past 50 years.
This dataset contains CO2 emissions (metric tons per capita) of all the countries from 1960 to 2018.
arXiv Detail & Related papers (2022-11-26T07:04:52Z) - A Comparative Study of Machine Learning and Deep Learning Techniques for Prediction of Co2 Emission in Cars [2.362412515574206]
There is mounting evidence that the CO2 numbers supplied by the government do not accurately reflect the performance of automobiles on the road.
To determine which algorithms and models produce the greatest outcomes, we compared them all and explored a novel method of ensembling them.
This can be used to foretell the rise in global temperature and to ground crucial policy decisions like the adoption of electric vehicles.
arXiv Detail & Related papers (2022-11-15T16:20:39Z) - Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model [72.65502770895417]
We quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle.
We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption.
We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of machine learning models.
arXiv Detail & Related papers (2022-11-03T17:13:48Z) - Eco2AI: carbon emissions tracking of machine learning models as the first step towards sustainable AI [47.130004596434816]
In eco2AI, we emphasize accurate energy consumption tracking and correct regional CO2 emissions accounting.
The motivation also comes from the concept of an AI-based greenhouse gas sequestration cycle with both Sustainable AI and Green AI pathways.
arXiv Detail & Related papers (2022-07-31T09:34:53Z) - Joint Study of Above Ground Biomass and Soil Organic Carbon for Total Carbon Estimation using Satellite Imagery in Scotland [0.0]
Land Carbon verification has long been a challenge in the carbon credit market.
Remote sensing techniques enable new approaches to monitor changes in Above Ground Biomass (AGB) and Soil Organic Carbon (SOC).
arXiv Detail & Related papers (2022-05-08T20:23:30Z) - Optimizing carbon tax for decentralized electricity markets using an agent-based model [69.3939291118954]
Averting the effects of anthropogenic climate change requires a transition from fossil fuels to low-carbon technology.
Carbon taxes have been shown to be an efficient way to aid in this transition.
We use the NSGA-II genetic algorithm to minimize average electricity price and relative carbon intensity of the electricity mix.
arXiv Detail & Related papers (2020-05-28T06:54:43Z) - Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning [68.37641996188133]
We introduce a framework for tracking real-time energy consumption and carbon emissions.
We create a leaderboard for energy efficient reinforcement learning algorithms.
We propose strategies for mitigation of carbon emissions and reduction of energy consumption.
arXiv Detail & Related papers (2020-01-31T05:12:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.