Photonics for Sustainable Computing
- URL: http://arxiv.org/abs/2401.05121v1
- Date: Wed, 10 Jan 2024 12:37:23 GMT
- Title: Photonics for Sustainable Computing
- Authors: Farbin Fayza, Satyavolu Papa Rao, Darius Bunandar, Udit Gupta, Ajay
Joshi
- Abstract summary: Photonic integrated circuits are finding use in a variety of applications including optical transceivers, LIDAR, bio-sensing, photonic quantum computing, and Machine Learning.
In this paper, we build a carbon footprint model for photonic chips and investigate the sustainability of photonics-based accelerators.
Our analysis shows that photonics can reduce both operational and embodied carbon footprints with its high energy efficiency.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Photonic integrated circuits are finding use in a variety of applications
including optical transceivers, LIDAR, bio-sensing, photonic quantum computing,
and Machine Learning (ML). In particular, with the exponentially increasing
sizes of ML models, photonics-based accelerators are getting special attention
as a sustainable solution because they can perform ML inferences with multiple
orders of magnitude higher energy efficiency than CMOS-based accelerators.
However, recent studies have shown that hardware manufacturing and
infrastructure contribute significantly to the carbon footprint of computing
devices, even surpassing the emissions generated during their use. For example,
the manufacturing process accounts for 74% of the total carbon emissions from
Apple in 2019. This prompts us to ask -- if we consider both the embodied
(manufacturing) and operational carbon cost of photonics, is it indeed a viable
avenue for a sustainable future? So, in this paper, we build a carbon footprint
model for photonic chips and investigate the sustainability of photonics-based
accelerators by conducting a case study on ADEPT, a photonics-based accelerator
for deep neural network inference. Our analysis shows that photonics can reduce
both operational and embodied carbon footprints with its high energy efficiency
and at least 4$\times$ less fabrication carbon cost per unit area than 28 nm
CMOS.
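The accounting the abstract describes splits a chip's lifetime footprint into an embodied (manufacturing) term and an operational (use-phase) term. A minimal sketch of that framing is below; all numeric values are illustrative placeholders, not figures from the paper, and the function is a hypothetical helper rather than the authors' actual model.

```python
# Hedged sketch: lifetime carbon footprint = embodied + operational.
# All numbers are illustrative placeholders, not values from the paper.

def total_carbon_kg(embodied_kg: float,
                    energy_per_inference_kwh: float,
                    num_inferences: float,
                    grid_intensity_kg_per_kwh: float) -> float:
    """Total lifetime footprint: embodied plus operational carbon (kg CO2e)."""
    operational_kg = (energy_per_inference_kwh * num_inferences
                      * grid_intensity_kg_per_kwh)
    return embodied_kg + operational_kg

# Illustrative comparison: an accelerator with lower embodied carbon and much
# lower energy per inference amortizes its manufacturing cost faster than a
# CMOS baseline over the same number of inferences.
cmos_kg = total_carbon_kg(embodied_kg=20.0,
                          energy_per_inference_kwh=1e-6,
                          num_inferences=1e9,
                          grid_intensity_kg_per_kwh=0.4)
photonic_kg = total_carbon_kg(embodied_kg=5.0,
                              energy_per_inference_kwh=1e-8,
                              num_inferences=1e9,
                              grid_intensity_kg_per_kwh=0.4)
```

With these placeholder inputs the operational term dominates the CMOS total, which is why the paper's combination of lower per-area fabrication carbon and higher energy efficiency matters for the overall comparison.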
Related papers
- The Sunk Carbon Fallacy: Rethinking Carbon Footprint Metrics for Effective Carbon-Aware Scheduling [2.562727244613512]
We evaluate carbon-aware job scheduling and placement on a given set of servers for a number of carbon accounting metrics.
We study the factors that affect the added carbon cost of such suboptimal decision-making.
arXiv Detail & Related papers (2024-10-19T12:23:59Z)
- Generative AI for Low-Carbon Artificial Intelligence of Things with Large Language Models [67.0243099823109]
Generative AI (GAI) holds immense potential to reduce the carbon emissions of the Artificial Intelligence of Things (AIoT).
In this article, we explore the potential of GAI for carbon emissions reduction and propose a novel GAI-enabled solution for low-carbon AIoT.
We propose a Large Language Model (LLM)-enabled carbon emission optimization framework, in which we design pluggable LLM and Retrieval Augmented Generation (RAG) modules.
arXiv Detail & Related papers (2024-04-28T05:46:28Z)
- LLMCarbon: Modeling the end-to-end Carbon Footprint of Large Language Models [7.132822974156601]
The carbon footprint of large language models (LLMs) is a significant concern, encompassing emissions from their training, inference, experimentation, and storage processes.
We introduce LLMCarbon, an end-to-end carbon footprint projection model designed for both dense and MoE LLMs.
arXiv Detail & Related papers (2023-09-25T14:50:04Z)
- Machine Guided Discovery of Novel Carbon Capture Solvents [48.7576911714538]
Machine learning offers a promising method for reducing the time and resource burdens of materials development.
We have developed an end-to-end "discovery cycle" to select new aqueous amines compatible with commercially viable acid-gas-scrubbing carbon capture.
The prediction process shows 60% accuracy against experiment for both material parameters and 80% for a single parameter on an external test set.
arXiv Detail & Related papers (2023-03-24T18:32:38Z)
- Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy carries an environmental cost in terms of greenhouse gas emissions, depending on the quantity of energy used and its source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z)
- PhAST: Physics-Aware, Scalable, and Task-specific GNNs for Accelerated Catalyst Design [102.9593507372373]
Catalyst materials play a crucial role in the electrochemical reactions involved in industrial processes.
Machine learning holds the potential to efficiently model materials properties from large amounts of data.
We propose task-specific innovations applicable to most architectures, enhancing both computational efficiency and accuracy.
arXiv Detail & Related papers (2022-11-22T05:24:30Z)
- Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model [72.65502770895417]
We quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle.
We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption.
We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of machine learning models.
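Life-cycle estimates like BLOOM's typically derive operational emissions from measured energy, a data-center overhead factor (PUE), and the grid's carbon intensity. The sketch below shows that standard calculation with placeholder inputs; the function and its numeric values are illustrative assumptions, not the study's actual measurements.

```python
# Hedged sketch of the standard operational-emissions estimate:
# emissions = energy * PUE * grid carbon intensity.
# Inputs below are placeholders, not BLOOM's actual measurements.

def operational_emissions_tonnes(energy_mwh: float,
                                 pue: float,
                                 grid_kg_co2e_per_kwh: float) -> float:
    """Operational CO2e in tonnes from measured energy consumption."""
    kwh = energy_mwh * 1000.0                        # MWh -> kWh
    kg = kwh * pue * grid_kg_co2e_per_kwh            # energy * overhead * intensity
    return kg / 1000.0                               # kg -> tonnes

# Placeholder example: 433 MWh at PUE 1.2 on a low-carbon grid.
est_tonnes = operational_emissions_tonnes(energy_mwh=433.0,
                                          pue=1.2,
                                          grid_kg_co2e_per_kwh=0.057)
```

Note that this captures only use-phase emissions; as the abstract stresses, the figure changes substantially depending on whether idle power, infrastructure, and hardware manufacturing are included.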
arXiv Detail & Related papers (2022-11-03T17:13:48Z)
- Measuring the Carbon Intensity of AI in Cloud Instances [91.28501520271972]
We provide a framework for measuring software carbon intensity, and propose to measure operational carbon emissions.
We evaluate a suite of approaches for reducing emissions on the Microsoft Azure cloud compute platform.
arXiv Detail & Related papers (2022-06-10T17:04:04Z)
- Carbon Footprint of Selecting and Training Deep Learning Models for Medical Image Analysis [0.2936007114555107]
We focus on the carbon footprint of developing deep learning models for medical image analysis (MIA).
We present and compare the features of four tools to quantify the carbon footprint of DL.
We discuss simple strategies to cut down environmental impact that can make model selection and training processes more efficient.
arXiv Detail & Related papers (2022-03-04T09:22:47Z)
- Carbon Emissions and Large Neural Network Training [19.233899715628073]
We calculate the energy use and carbon footprint of several recent large models: T5, Meena, GShard, Switch Transformer, and GPT-3.
We highlight opportunities to improve energy efficiency and reduce CO2-equivalent emissions (CO2e).
To help reduce the carbon footprint of ML, we believe energy usage and CO2e should be a key metric in evaluating models.
arXiv Detail & Related papers (2021-04-21T04:44:25Z)
- Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models [0.3441021278275805]
Machine learning (ML) may become a significant contributor to climate change if the exponential growth in training compute continues.
We propose that energy and carbon footprint of model development and training is reported alongside performance metrics using tools like Carbontracker.
arXiv Detail & Related papers (2020-07-06T20:24:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.