Smart IoT-Based Leak Forecasting and Detection for Energy-Efficient Liquid Cooling in AI Data Centers
- URL: http://arxiv.org/abs/2512.21801v1
- Date: Thu, 25 Dec 2025 22:51:16 GMT
- Title: Smart IoT-Based Leak Forecasting and Detection for Energy-Efficient Liquid Cooling in AI Data Centers
- Authors: Krishna Chaitanya Sunkara, Rambabu Konakanchi,
- Abstract summary: We present a proof-of-concept smart IoT monitoring system combining LSTM neural networks for probabilistic leak forecasting with Random Forest classifiers for instant detection. For a typical 47-rack facility, this approach could prevent roughly 1,500 kWh of annual energy waste.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: GPU-centric AI data centers have adopted liquid cooling to handle extreme heat loads, but coolant leaks result in substantial energy loss through unplanned shutdowns and extended repair periods. We present a proof-of-concept smart IoT monitoring system combining LSTM neural networks for probabilistic leak forecasting with Random Forest classifiers for instant detection. Testing on synthetic data aligned with ASHRAE 2021 standards, our approach achieves 96.5% detection accuracy and 87% forecasting accuracy at 90% probability within plus or minus 30-minute windows. Analysis demonstrates that humidity, pressure, and flow rate deliver strong predictive signals, while temperature exhibits minimal immediate response due to thermal inertia in server hardware. The system employs MQTT streaming, InfluxDB storage, and Streamlit dashboards, forecasting leaks 2-4 hours ahead while identifying sudden events within 1 minute. For a typical 47-rack facility, this approach could prevent roughly 1,500 kWh of annual energy waste through proactive maintenance rather than reactive emergency procedures. While validation remains synthetic-only, results establish feasibility for future operational deployment in sustainable data center operations.
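The abstract's instant-detection stage relies on the observation that humidity, pressure, and flow rate carry the strongest leak signals while temperature lags. As a rough illustration only (not the authors' code, and with made-up thresholds and field names), a simplified rule-based stand-in for the Random Forest detector might look like:

```python
# Hypothetical simplified stand-in for the paper's Random Forest leak
# detector. It flags a leak when at least two of three signals deviate
# from a per-rack baseline in the leak-consistent direction:
# humidity up, loop pressure down, flow rate down. Thresholds and
# field names are illustrative assumptions, not from the paper.

def detect_leak(reading, baseline, rel_tol=0.05):
    """Return True if at least two of three signals deviate from the
    baseline by more than rel_tol in the leak-consistent direction."""
    signals = 0
    if reading["humidity"] > baseline["humidity"] * (1 + rel_tol):
        signals += 1  # escaping coolant raises local humidity
    if reading["pressure"] < baseline["pressure"] * (1 - rel_tol):
        signals += 1  # loss of coolant drops loop pressure
    if reading["flow"] < baseline["flow"] * (1 - rel_tol):
        signals += 1  # flow rate falls downstream of the breach
    return signals >= 2

baseline = {"humidity": 45.0, "pressure": 2.0, "flow": 30.0}
normal   = {"humidity": 45.5, "pressure": 1.99, "flow": 29.8}
leaking  = {"humidity": 52.0, "pressure": 1.80, "flow": 26.0}
print(detect_leak(normal, baseline))   # False
print(detect_leak(leaking, baseline))  # True
```

In the paper's actual pipeline, such per-reading features would instead be fed to a trained Random Forest, with readings arriving over MQTT and persisted to InfluxDB; the two-of-three voting here merely mirrors the intuition that leaks move several sensor channels at once.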
Related papers
- 2D-ThermAl: Physics-Informed Framework for Thermal Analysis of Circuits using Generative AI [9.414651358362388]
'ThermAl' is a physics-informed generative AI framework which effectively identifies heat sources and estimates full-chip transient and steady-state thermal distributions. Our model is trained on an extensive dataset of heat dissipation maps, ranging from simple logic gates (e.g., inverters, NAND, XOR) to complex designs, generated via COMSOL. Experimental results demonstrate that ThermAl delivers precise temperature mappings for large circuits, with a root mean squared error (RMSE) of only 0.71 °C, and outperforms conventional FEM tools by running up to 200 times faster.
arXiv Detail & Related papers (2025-12-01T00:45:26Z) - EventFlow: Real-Time Neuromorphic Event-Driven Classification of Two-Phase Boiling Flow Regimes [1.242656043400274]
Flow boiling is an efficient heat transfer mechanism capable of dissipating high heat loads with minimal temperature variation. Sudden shifts between flow regimes can disrupt thermal performance and system reliability. We propose a real-time framework based on signals from neuromorphic sensors for flow regime classification.
arXiv Detail & Related papers (2025-11-07T18:13:46Z) - A Lightweight DL Model for Smart Grid Power Forecasting with Feature and Resolution Mismatch [0.4999814847776097]
This paper challenges teams to predict day-ahead power demand using real-world high-frequency data. We propose a robust yet lightweight Deep Learning pipeline combining hourly downsizing, dual-mode imputation, and comprehensive normalization. A sequence-to-one model achieves an average RMSE of 601.9 W, MAE of 468.9 W, and 84.36% accuracy.
arXiv Detail & Related papers (2025-10-19T16:12:53Z) - Automated Energy-Aware Time-Series Model Deployment on Embedded FPGAs for Resilient Combined Sewer Overflow Management [17.903318666906728]
Extreme weather events, intensified by climate change, increasingly challenge aging combined sewer systems. Forecasting of sewer overflow basin filling levels can provide actionable insights for early intervention. We propose an end-to-end forecasting framework that enables energy-efficient inference directly on edge devices.
arXiv Detail & Related papers (2025-08-19T15:06:04Z) - AI-Powered Dynamic Fault Detection and Performance Assessment in Photovoltaic Systems [44.99833362998488]
The intermittent nature of photovoltaic (PV) solar energy leads to power losses of 10-70% and an average energy production decrease of 25%.
Current fault detection strategies are costly and often yield unreliable results due to complex data signal profiles.
This research presents a computational model using the PVlib library in Python, incorporating a dynamic loss quantification algorithm.
arXiv Detail & Related papers (2024-08-19T23:52:06Z) - Long-term drought prediction using deep neural networks based on geospatial weather data [75.38539438000072]
High-quality drought forecasting up to a year in advance is critical for agriculture planning and insurance.
We tackle drought prediction by introducing a systematic end-to-end approach.
Key findings are the exceptional performance of a Transformer model, EarthFormer, in making accurate short-term (up to six months) forecasts.
arXiv Detail & Related papers (2023-09-12T13:28:06Z) - Evaluating Short-Term Forecasting of Multiple Time Series in IoT Environments [67.24598072875744]
Internet of Things (IoT) environments are monitored via a large number of IoT enabled sensing devices.
To alleviate this issue, sensors are often configured to operate at relatively low sampling frequencies.
This can hamper dramatically subsequent decision-making, such as forecasting.
arXiv Detail & Related papers (2022-06-15T19:46:59Z) - SOUL: An Energy-Efficient Unsupervised Online Learning Seizure Detection Classifier [68.8204255655161]
Implantable devices that record neural activity and detect seizures have been adopted to issue warnings or trigger neurostimulation to suppress seizures.
For an implantable seizure detection system, a low power, at-the-edge, online learning algorithm can be employed to dynamically adapt to neural signal drifts.
SOUL was fabricated in TSMC's 28 nm process occupying 0.1 mm² and achieves 1.5 nJ/classification energy efficiency, which is at least 24x more efficient than the state of the art.
arXiv Detail & Related papers (2021-10-01T23:01:20Z) - Artificial Intelligence based Sensor Data Analytics Framework for Remote Electricity Network Condition Monitoring [0.0]
Rural electrification demands the use of inexpensive technologies such as single wire earth return (SWER) networks.
There is a steadily growing energy demand from remote consumers, and the capacity of existing lines may become inadequate soon.
High impedance arcing faults (HIF) from SWER lines can cause catastrophic bushfires such as the 2009 Black Saturday event.
arXiv Detail & Related papers (2021-01-21T07:50:01Z) - Deep Anomaly Detection for Time-series Data in Industrial IoT: A Communication-Efficient On-device Federated Learning Approach [40.992167455141946]
This paper proposes a new communication-efficient on-device federated learning (FL)-based deep anomaly detection framework for sensing time-series data in IIoT.
We first introduce an FL framework to enable decentralized edge devices to collaboratively train an anomaly detection model, which can improve its generalization ability.
Second, we propose an Attention Mechanism-based Convolutional Neural Network-Long Short Term Memory (AMCNN-LSTM) model to accurately detect anomalies.
Third, to adapt the proposed framework to the timeliness of industrial anomaly detection, we propose a gradient compression mechanism based on Top-k selection.
arXiv Detail & Related papers (2020-07-19T16:47:26Z) - Adaptive Anomaly Detection for IoT Data in Hierarchical Edge Computing [71.86955275376604]
We propose an adaptive anomaly detection approach for hierarchical edge computing (HEC) systems to solve this problem.
We design an adaptive scheme to select one of the models based on the contextual information extracted from input data, to perform anomaly detection.
We evaluate our proposed approach using a real IoT dataset, and demonstrate that it reduces detection delay by 84% while maintaining almost the same accuracy as compared to offloading detection tasks to the cloud.
arXiv Detail & Related papers (2020-01-10T05:29:17Z)
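The Top-k gradient compression mentioned in the federated-learning entry above is a standard sparsification trick: each edge device uploads only the k largest-magnitude gradient entries instead of the full dense gradient. A minimal illustrative sketch (not the AMCNN-LSTM paper's implementation) might look like:

```python
# Illustrative sketch of Top-k gradient sparsification for
# communication-efficient federated learning: transmit only the k
# gradient entries with the largest magnitude, zeroing the rest.
# Variable names and the example gradient are hypothetical.

def top_k_sparsify(grad, k):
    """Keep the k largest-magnitude entries of grad; zero the rest.
    Returns (sparse_grad, sorted_indices_sent)."""
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    sparse = [0.0] * len(grad)
    for i in idx:
        sparse[i] = grad[i]
    return sparse, sorted(idx)

grad = [0.01, -0.8, 0.05, 0.6, -0.02]
sparse, sent = top_k_sparsify(grad, k=2)
print(sparse)  # [0.0, -0.8, 0.0, 0.6, 0.0]
print(sent)    # [1, 3]
```

In practice the dropped residual is usually accumulated locally and added back into the next round's gradient so that small but persistent updates are not lost; the sketch omits that bookkeeping for brevity.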
This list is automatically generated from the titles and abstracts of the papers in this site.