Forecasting emissions through Kaya identity using Neural Ordinary
Differential Equations
- URL: http://arxiv.org/abs/2201.02433v1
- Date: Fri, 7 Jan 2022 12:34:01 GMT
- Title: Forecasting emissions through Kaya identity using Neural Ordinary
Differential Equations
- Authors: Pierre Browne, Aranildo Lima, Rossella Arcucci, César Quilodrán-Casas
- Abstract summary: We use a Neural ODE model to predict the evolution of several indicators related to carbon emissions on a country-level.
We conclude that this machine-learning approach can be used to produce a wide range of results and give relevant insight to policymakers.
- Score: 3.4901787251083163
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Starting from the Kaya identity, we used a Neural ODE model to predict the
evolution of several indicators related to carbon emissions at the
country level: population, GDP per capita, energy intensity of GDP, and carbon
intensity of energy. We compared the model with a baseline statistical model
(VAR) and obtained good performance. We conclude that this machine-learning
approach can be used to produce a wide range of results and give relevant
insight to policymakers.
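The abstract's setup can be sketched in a few lines: the Kaya identity decomposes emissions as CO2 = population × (GDP/population) × (energy/GDP) × (CO2/energy), and a Neural ODE learns the time derivative of those four factors so they can be integrated forward in time. Below is a minimal sketch, not the authors' implementation: it assumes a toy two-layer network with random, untrained weights standing in for the learned dynamics, and a fixed-step Euler integrator in place of a proper ODE solver.

```python
import math
import random

# Kaya identity: CO2 emissions = P * (GDP/P) * (E/GDP) * (CO2/E)
# State x = [population, gdp_per_capita, energy_intensity, carbon_intensity].

def kaya_emissions(x):
    """Emissions as the product of the four Kaya factors."""
    product = 1.0
    for factor in x:
        product *= factor
    return product

# Hypothetical stand-in for the learned dynamics f_theta: in the paper the
# network would be trained on historical country-level data; here the weights
# are random placeholders just to make the integration loop concrete.
random.seed(0)
W1 = [[random.gauss(0, 0.1) for _ in range(4)] for _ in range(8)]
W2 = [[random.gauss(0, 0.1) for _ in range(8)] for _ in range(4)]

def f_theta(x):
    """dx/dt predicted by the (untrained) two-layer network."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    return [sum(w * hi for w, hi in zip(row, h)) for row in W2]

def forecast(x0, years, dt=0.1):
    """Integrate the neural ODE forward with fixed-step Euler.
    (A real Neural ODE would use an adaptive solver, e.g. Runge-Kutta.)"""
    x = list(x0)
    trajectory = [list(x)]
    for _ in range(round(years / dt)):
        dx = f_theta(x)
        x = [xi + dt * di for xi, di in zip(x, dx)]
        trajectory.append(list(x))
    return trajectory

# Illustrative starting state in normalized units (not real country data).
traj = forecast([1.0, 1.0, 1.0, 1.0], years=5)
emissions = [kaya_emissions(x) for x in traj]
```

Forecasting the four factors separately, then multiplying them through the identity, is what lets the model report emissions alongside interpretable drivers (population growth versus decarbonization of energy, for instance).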
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- CarbonSense: A Multimodal Dataset and Baseline for Carbon Flux Modelling [9.05128569357374]
We present CarbonSense, the first machine learning-ready dataset for data-driven carbon flux modelling.
Our experiments illustrate the potential gains that multimodal deep learning techniques can bring to this domain.
arXiv Detail & Related papers (2024-06-07T13:47:40Z)
- OpenCarbonEval: A Unified Carbon Emission Estimation Framework in Large-Scale AI Models [16.93272879722972]
OpenCarbonEval is a framework for integrating large-scale models across diverse modalities to predict carbon emissions.
We show that OpenCarbonEval achieves superior performance in predicting carbon emissions for both visual models and language models.
arXiv Detail & Related papers (2024-05-21T14:50:20Z)
- A Comparative Study of Machine Learning Algorithms for Anomaly Detection in Industrial Environments: Performance and Environmental Impact [62.997667081978825]
This study seeks to balance the demands of high-performance machine learning models with environmental sustainability.
Traditional machine learning algorithms, such as Decision Trees and Random Forests, demonstrate robust efficiency and performance.
However, superior outcomes were obtained with optimised configurations, albeit with a commensurate increase in resource consumption.
arXiv Detail & Related papers (2023-07-01T15:18:00Z)
- On Feature Diversity in Energy-based Models [98.78384185493624]
An energy-based model (EBM) typically comprises one or more inner models that learn a combination of different features to generate an energy mapping for each input configuration.
We extend the probably approximately correct (PAC) theory of EBMs and analyze the effect of redundancy reduction on the performance of EBMs.
arXiv Detail & Related papers (2023-06-02T12:30:42Z)
- Forecasting Intraday Power Output by a Set of PV Systems using Recurrent Neural Networks and Physical Covariates [0.0]
Accurate forecasts of the power output of photovoltaic (PV) systems are critical to improving the operation of energy distribution grids.
We describe a neural autoregressive model that aims to perform such intraday forecasts.
arXiv Detail & Related papers (2023-03-15T09:03:58Z)
- Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy carries an environmental cost in terms of greenhouse gas emissions, depending on the quantity used and the energy source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z)
- Energy Efficiency of Training Neural Network Architectures: An Empirical Study [11.325530936177493]
The evaluation of Deep Learning models has traditionally focused on criteria such as accuracy, F1 score, and related measures.
The computations needed to train such models entail a large carbon footprint.
We study the relations between DL model architectures and their environmental impact in terms of energy consumed and CO2 emissions produced during training.
arXiv Detail & Related papers (2023-02-02T09:20:54Z)
- Latent Variable Representation for Reinforcement Learning [131.03944557979725]
It remains unclear theoretically and empirically how latent variable models may facilitate learning, planning, and exploration to improve the sample efficiency of model-based reinforcement learning.
We provide a representation view of latent variable models for state-action value functions, which allows both a tractable variational learning algorithm and an effective implementation of the optimism/pessimism principle.
In particular, we propose a computationally efficient planning algorithm with UCB exploration by incorporating kernel embeddings of latent variable models.
arXiv Detail & Related papers (2022-12-17T00:26:31Z)
- Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict fuel properties over a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
- Curb Your Carbon Emissions: Benchmarking Carbon Emissions in Machine Translation [0.0]
We study the carbon efficiency and look for alternatives to reduce the overall environmental impact of training models.
In our work, we assess the performance of models for machine translation, across multiple language pairs.
We examine the various components of these models to analyze aspects of our pipeline that can be optimized to reduce these carbon emissions.
arXiv Detail & Related papers (2021-09-26T12:30:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.