A Multi-Objective Approach for Sustainable Generative Audio Models
- URL: http://arxiv.org/abs/2107.02621v1
- Date: Tue, 6 Jul 2021 13:52:27 GMT
- Title: A Multi-Objective Approach for Sustainable Generative Audio Models
- Authors: Constance Douwes, Philippe Esling and Jean-Pierre Briot
- Abstract summary: In recent years, the deep learning community has largely focused on the accuracy of deep generative models.
This scientific race for quality comes at a tremendous computational cost.
If the current exponential growth of computational consumption persists, Artificial Intelligence will sadly become a considerable contributor to global warming.
- Score: 1.933681537640272
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In recent years, the deep learning community has largely focused on the
accuracy of deep generative models, resulting in impressive improvements in
several research fields. However, this scientific race for quality comes at a
tremendous computational cost, which incurs vast energy consumption and
greenhouse gas emissions. If the current exponential growth of computational
consumption persists, Artificial Intelligence (AI) will sadly become a
considerable contributor to global warming.
At the heart of this problem are the measures that we use as a scientific
community to evaluate our work. Currently, researchers in the field of AI judge
scientific works mostly based on the improvement in accuracy, log-likelihood,
reconstruction or opinion scores, all of which entirely obliterate the actual
computational cost of generative models.
In this paper, we introduce the idea of relying on a multi-objective measure
based on Pareto optimality, which simultaneously integrates the models'
accuracy, as well as the environmental impact of their training. By applying
this measure to the current state of the art in generative audio models, we
show that this measure drastically changes the perceived significance of the
results in the field, encouraging optimal training techniques and resource
allocation. We hope that this type of measure will be widely adopted, in order
to help the community to better evaluate the significance of their work, while
bringing computational cost, and ultimately carbon emissions, into the spotlight
of AI research.
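The multi-objective measure rests on Pareto optimality: a model only counts as significant if no other model reaches at least the same quality while consuming no more energy during training. The sketch below is not the authors' implementation; it merely illustrates, with hypothetical model names and numbers, how a Pareto front over (accuracy, training energy) could be computed.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ModelRun:
    name: str
    accuracy: float    # quality score, higher is better
    energy_kwh: float  # training energy, lower is better

def pareto_front(runs: List[ModelRun]) -> List[ModelRun]:
    """Return the runs that are not dominated by any other run.

    A run is dominated if some other run is at least as accurate AND
    uses at most as much energy, with at least one strict improvement.
    """
    front = []
    for a in runs:
        dominated = any(
            b.accuracy >= a.accuracy and b.energy_kwh <= a.energy_kwh
            and (b.accuracy > a.accuracy or b.energy_kwh < a.energy_kwh)
            for b in runs
        )
        if not dominated:
            front.append(a)
    return front

if __name__ == "__main__":
    # Hypothetical numbers, for illustration only.
    runs = [
        ModelRun("small-model", accuracy=0.81, energy_kwh=12.0),
        ModelRun("large-model", accuracy=0.86, energy_kwh=310.0),
        ModelRun("mid-model", accuracy=0.84, energy_kwh=45.0),
        ModelRun("wasteful-model", accuracy=0.80, energy_kwh=200.0),  # dominated
    ]
    for run in pareto_front(runs):
        print(run.name, run.accuracy, run.energy_kwh)
```

In a real evaluation, the quality axis would be whatever metric the field already reports (log-likelihood, reconstruction error, opinion scores) and the cost axis a measured energy or emissions figure for training.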
Related papers
- Hype, Sustainability, and the Price of the Bigger-is-Better Paradigm in AI [67.58673784790375]
We argue that the 'bigger is better' AI paradigm is not only fragile scientifically, but comes with undesirable consequences.
First, it is not sustainable, as its compute demands increase faster than model performance, leading to unreasonable economic requirements and a disproportionate environmental footprint.
Second, it implies focusing on certain problems at the expense of others, leaving aside important applications, e.g. health, education, or the climate.
arXiv Detail & Related papers (2024-09-21T14:43:54Z)
- Reporting and Analysing the Environmental Impact of Language Models on the Example of Commonsense Question Answering with External Knowledge [7.419725234099729]
ChatGPT sparked social interest in Large Language Models (LLMs).
LLMs demand substantial computational resources and are very costly to train, both financially and environmentally.
In this study, we infused a T5 LLM with external knowledge and fine-tuned the model for a question-answering task.
arXiv Detail & Related papers (2024-07-24T16:16:16Z)
- On the Opportunities of Green Computing: A Survey [80.21955522431168]
Artificial Intelligence (AI) has achieved significant advancements in technology and research over several decades of development.
The need for high computing power brings higher carbon emissions and undermines research fairness.
To tackle the challenges of computing resources and the environmental impact of AI, Green Computing has become a hot research topic.
arXiv Detail & Related papers (2023-11-01T11:16:41Z)
- Computation-efficient Deep Learning for Computer Vision: A Survey [121.84121397440337]
Deep learning models have reached or even exceeded human-level performance in a range of visual perception tasks.
Deep learning models usually demand significant computational resources, leading to impractical power consumption, latency, or carbon emissions in real-world scenarios.
A new research focus is computationally efficient deep learning, which strives to achieve satisfactory performance while minimizing the computational cost during inference.
arXiv Detail & Related papers (2023-08-27T03:55:28Z)
- A Comparative Study of Machine Learning Algorithms for Anomaly Detection in Industrial Environments: Performance and Environmental Impact [62.997667081978825]
This study seeks to reconcile the demands of high-performance machine learning models with environmental sustainability.
Traditional machine learning algorithms, such as Decision Trees and Random Forests, demonstrate robust efficiency and performance.
However, superior outcomes were obtained with optimised configurations, albeit with a commensurate increase in resource consumption.
arXiv Detail & Related papers (2023-07-01T15:18:00Z)
- A Green(er) World for A.I. [9.330274375369802]
This paper outlines our outlook for Green A.I.: a more sustainable, energy-efficient and energy-aware ecosystem.
We present a bird's eye view of various areas for potential changes and improvements.
We hope these points will spur further discussion, and action, on some of these issues and their potential solutions.
arXiv Detail & Related papers (2023-01-27T08:01:38Z)
- Eco2AI: carbon emissions tracking of machine learning models as the first step towards sustainable AI [47.130004596434816]
In eco2AI, we put emphasis on the accuracy of energy consumption tracking and correct regional CO2 emissions accounting (a minimal usage sketch appears after this list).
The motivation also comes from the concept of an AI-based greenhouse gas sequestration cycle with both Sustainable AI and Green AI pathways.
arXiv Detail & Related papers (2022-07-31T09:34:53Z)
- From Convolutions towards Spikes: The Environmental Metric that the Community currently Misses [3.498371632913735]
We show that currently used ANNs are not what we find in nature, and why spiking neural networks, despite their lower performance, have attracted much interest.
We highlight the hardware gaps restricting the researchers from using spike-based computation for developing neuromorphic energy-efficient microchips.
We also define a new evaluation metric 'NATURE' for reporting the carbon footprint of AI models.
arXiv Detail & Related papers (2021-11-16T11:04:42Z)
- A Survey on Green Deep Learning [25.71572024291251]
This paper focuses on presenting a systematic review of the development of Green deep learning technologies.
We classify these approaches into four categories: (1) compact networks, (2) energy-efficient training strategies, (3) energy-efficient inference approaches, and (4) efficient data usage.
arXiv Detail & Related papers (2021-11-08T16:55:03Z)
- Compute and Energy Consumption Trends in Deep Learning Inference [67.32875669386488]
We study relevant models in the areas of computer vision and natural language processing.
For a sustained increase in performance we see a much softer growth in energy consumption than previously anticipated.
arXiv Detail & Related papers (2021-09-12T09:40:18Z)
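As referenced in the eco2AI entry above, energy and CO2 tracking is the practical way to obtain the environmental axis of such a multi-objective measure. The sketch below follows the usage pattern documented for the eco2ai Python package; the parameter names are taken from its public examples but should be treated as assumptions rather than a verified API reference, and the training loop is a placeholder.

```python
# Minimal sketch: wrapping a training run with an emissions tracker
# (eco2ai-style usage; parameter names are assumptions and may differ
# between package versions).
import eco2ai

tracker = eco2ai.Tracker(
    project_name="generative-audio",        # hypothetical project name
    experiment_description="training run",  # free-form description
    file_name="emissions.csv",              # where measurements are logged
)

tracker.start()
# ... model training loop goes here ...
tracker.stop()

# The logged energy/CO2 figures can then serve as the cost axis of the
# Pareto measure sketched after the abstract above.
```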
This list is automatically generated from the titles and abstracts of the papers on this site.