Carbon Footprint of Selecting and Training Deep Learning Models for
Medical Image Analysis
- URL: http://arxiv.org/abs/2203.02202v1
- Date: Fri, 4 Mar 2022 09:22:47 GMT
- Title: Carbon Footprint of Selecting and Training Deep Learning Models for
Medical Image Analysis
- Authors: Raghavendra Selvan, Nikhil Bhagwat, Lasse F. Wolff Anthony, Benjamin
Kanding, Erik B. Dam
- Abstract summary: We focus on the carbon footprint of developing deep learning models for medical image analysis (MIA).
We present and compare the features of four tools to quantify the carbon footprint of DL.
We discuss simple strategies to cut down the environmental impact that can make model selection and training processes more efficient.
- Score: 0.2936007114555107
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The increasing energy consumption and carbon footprint of deep learning (DL)
due to growing compute requirements has become a cause of concern. In this
work, we focus on the carbon footprint of developing DL models for medical
image analysis (MIA), where volumetric images of high spatial resolution are
handled. In this study, we present and compare the features of four tools from
literature to quantify the carbon footprint of DL. Using one of these tools we
estimate the carbon footprint of medical image segmentation pipelines. We
choose nnU-net as the proxy for a medical image segmentation pipeline and
experiment on three common datasets. With our work we hope to raise awareness of
the increasing energy costs incurred by MIA. We discuss simple strategies to
cut down the environmental impact that can make model selection and training
processes more efficient.
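The tools compared in the paper generally follow the same accounting scheme: energy drawn by the hardware, scaled by datacenter overhead (PUE), multiplied by the carbon intensity of the local grid. A minimal sketch of that calculation follows; the function name and the default constants (PUE of 1.58 and intensity of 0.475 kg CO2eq/kWh, roughly global averages) are illustrative assumptions, not values from the paper:

```python
def training_emissions_kg(avg_power_w: float, hours: float,
                          pue: float = 1.58,
                          intensity_kg_per_kwh: float = 0.475) -> float:
    """Estimate CO2-equivalent emissions of a training run.

    avg_power_w          -- average draw of GPUs/CPUs in watts
    hours                -- wall-clock training time
    pue                  -- datacenter power usage effectiveness (overhead factor)
    intensity_kg_per_kwh -- carbon intensity of the electricity grid
    """
    energy_kwh = avg_power_w * hours / 1000.0
    return energy_kwh * pue * intensity_kg_per_kwh

# e.g. a single GPU averaging 250 W over a 24-hour run:
# training_emissions_kg(250, 24) -> about 4.5 kg CO2eq
```

Grid intensity varies by more than an order of magnitude between regions, which is why the surveyed tools look up location-specific intensities rather than a fixed constant.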
Related papers
- EFCM: Efficient Fine-tuning on Compressed Models for deployment of large models in medical image analysis [17.876140405367764]
This study presents an Efficient Fine-tuning on Compressed Models (EFCM) framework with two stages: unsupervised feature distillation and fine-tuning.
Experiments are conducted on 11 downstream datasets related to three large medical models: RETFound for retina, MRM for chest X-ray, and BROW for histopathology.
arXiv Detail & Related papers (2024-09-18T09:08:16Z) - End-to-End Model-based Deep Learning for Dual-Energy Computed Tomography Material Decomposition [53.14236375171593]
We propose a deep learning procedure called End-to-End Material Decomposition (E2E-DEcomp) for quantitative material decomposition.
We show the effectiveness of the proposed direct E2E-DEcomp method on the AAPM spectral CT dataset.
arXiv Detail & Related papers (2024-06-01T16:20:59Z) - IoTCO2: Assessing the End-To-End Carbon Footprint of Internet-of-Things-Enabled Deep Learning [6.582643137531881]
Deep learning (DL) models are increasingly deployed on Internet of Things (IoT) devices for data processing.
IoTCO2 is an end-to-end tool for precise carbon footprint estimation in IoT-enabled DL.
arXiv Detail & Related papers (2024-03-16T17:32:59Z) - Machine Guided Discovery of Novel Carbon Capture Solvents [48.7576911714538]
Machine learning offers a promising method for reducing the time and resource burdens of materials development.
We have developed an end-to-end "discovery cycle" to select new aqueous amines compatible with commercially viable acid-gas scrubbing carbon capture.
The prediction process shows 60% accuracy against experiment for both material parameters and 80% for a single parameter on an external test set.
arXiv Detail & Related papers (2023-03-24T18:32:38Z) - Counting Carbon: A Survey of Factors Influencing the Emissions of
Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy comes with an environmental cost in terms of greenhouse gas emissions, depending on the quantity of energy used and its source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z) - Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language
Model [72.65502770895417]
We quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle.
We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption.
We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of machine learning models.
arXiv Detail & Related papers (2022-11-03T17:13:48Z) - Incremental Cross-view Mutual Distillation for Self-supervised Medical
CT Synthesis [88.39466012709205]
This paper proposes a novel medical slice synthesis method to increase the between-slice resolution.
Considering that the ground-truth intermediate medical slices are always absent in clinical practice, we introduce the incremental cross-view mutual distillation strategy.
Our method outperforms state-of-the-art algorithms by clear margins.
arXiv Detail & Related papers (2021-12-20T03:38:37Z) - Leveraging Human Selective Attention for Medical Image Analysis with
Limited Training Data [72.1187887376849]
The selective attention mechanism helps the cognition system focus on task-relevant visual clues by ignoring the presence of distractors.
We propose a framework to leverage gaze for medical image analysis tasks with small training data.
Our method is demonstrated to achieve superior performance on both 3D tumor segmentation and 2D chest X-ray classification tasks.
arXiv Detail & Related papers (2021-12-02T07:55:25Z) - Curb Your Carbon Emissions: Benchmarking Carbon Emissions in Machine
Translation [0.0]
We study the carbon efficiency of training machine translation models and look for alternatives to reduce their overall environmental impact.
In our work, we assess the performance of models for machine translation, across multiple language pairs.
We examine the various components of these models to analyze aspects of our pipeline that can be optimized to reduce these carbon emissions.
arXiv Detail & Related papers (2021-09-26T12:30:10Z) - Carbontracker: Tracking and Predicting the Carbon Footprint of Training
Deep Learning Models [0.3441021278275805]
Machine learning (ML) may become a significant contributor to climate change if the exponential growth in compute continues.
We propose that the energy and carbon footprint of model development and training be reported alongside performance metrics, using tools like Carbontracker.
arXiv Detail & Related papers (2020-07-06T20:24:31Z)
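Carbontracker-style tools wrap the training loop with per-epoch start/stop calls and accumulate energy over time. The toy stand-in below illustrates that accounting pattern only; the class and method names are invented for this sketch and are not the actual Carbontracker API, which reads real power sensors rather than assuming an average draw:

```python
import time


class ToyEnergyTracker:
    """Toy per-epoch energy accounting (illustrative, not the Carbontracker API).

    Integrates an assumed average power draw over measured epoch durations,
    then converts the accumulated energy to estimated CO2-equivalent emissions.
    """

    def __init__(self, avg_power_w: float, intensity_kg_per_kwh: float = 0.475):
        self.avg_power_w = avg_power_w
        self.intensity = intensity_kg_per_kwh
        self.total_kwh = 0.0
        self._t0 = None

    def epoch_start(self) -> None:
        self._t0 = time.monotonic()

    def epoch_end(self) -> None:
        hours = (time.monotonic() - self._t0) / 3600.0
        self.total_kwh += self.avg_power_w * hours / 1000.0

    def report(self) -> float:
        """Estimated kg CO2eq accumulated so far."""
        return self.total_kwh * self.intensity
```

In use, `epoch_start()`/`epoch_end()` bracket each training epoch, and `report()` is called at the end of the run, mirroring the report-alongside-metrics practice the Carbontracker paper advocates.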
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.