Predicting Cascade Failures in Interdependent Urban Infrastructure Networks
- URL: http://arxiv.org/abs/2503.02890v1
- Date: Wed, 26 Feb 2025 14:50:22 GMT
- Title: Predicting Cascade Failures in Interdependent Urban Infrastructure Networks
- Authors: Yinzhou Tang, Jinghua Piao, Huandong Wang, Shaw Rajib, Yong Li
- Abstract summary: Cascading failures (CF) entail component breakdowns spreading through infrastructure networks, causing system-wide collapse. We introduce the Integrated Interdependent Infrastructure CF model ($I^3$), designed to capture CF dynamics both within and across infrastructures. $I^3$ achieves boosts of 31.94% in AUC, 18.03% in Precision, 29.17% in Recall, and 22.73% in F1-score in predicting infrastructure failures.
- Score: 10.59074382276026
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cascading failures (CF) entail component breakdowns spreading through infrastructure networks, causing system-wide collapse. Predicting CFs is of great importance for infrastructure stability and urban function. Despite extensive research on CFs in single networks such as electricity and road networks, interdependencies among diverse infrastructures remain overlooked, and capturing intra-infrastructure CF dynamics amid complex evolutions poses challenges. To address these gaps, we introduce the \textbf{I}ntegrated \textbf{I}nterdependent \textbf{I}nfrastructure CF model ($I^3$), designed to capture CF dynamics both within and across infrastructures. $I^3$ employs a dual GAE with global pooling for intra-infrastructure dynamics and a heterogeneous graph for inter-infrastructure interactions. An initial node enhancement pre-training strategy mitigates GCN-induced over-smoothing. Experiments demonstrate that $I^3$ achieves boosts of 31.94\% in AUC, 18.03\% in Precision, 29.17\% in Recall, and 22.73\% in F1-score in predicting infrastructure failures, and a 28.52\% reduction in RMSE for cascade volume forecasts compared to leading models. It accurately pinpoints phase transitions in interconnected and singular networks, rectifying biases in models tailored for singular networks. Access the code at https://github.com/tsinghua-fib-lab/Icube.
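The GAE-plus-GCN machinery mentioned in the abstract can be illustrated with a minimal sketch: one symmetrically normalized GCN propagation step to produce node embeddings, followed by an inner-product decoder that scores links. This is a generic NumPy toy of the graph-autoencoder idea, not the authors' $I^3$ implementation; the graph, features, and weights below are invented for illustration.

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One GCN propagation step: D^-1/2 (A + I) D^-1/2 X W, then ReLU."""
    a_hat = adj + np.eye(adj.shape[0])             # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    norm = d_inv_sqrt @ a_hat @ d_inv_sqrt         # symmetric normalization
    return np.maximum(norm @ feats @ weight, 0.0)  # ReLU

def gae_reconstruct(adj, feats, weight):
    """GAE-style link scores: sigmoid of the embedding inner products."""
    z = gcn_layer(adj, feats, weight)
    return 1.0 / (1.0 + np.exp(-(z @ z.T)))

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)  # toy 4-node infrastructure graph
X = rng.normal(size=(4, 3))                # toy node features
W = rng.normal(size=(3, 2))                # toy layer weights
scores = gae_reconstruct(A, X, W)          # (4, 4) link-probability matrix
```

The symmetric decoder means `scores` is itself symmetric, matching an undirected infrastructure graph; a trained model would fit `W` so that high scores coincide with true links.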
Related papers
- Informed Greedy Algorithm for Scalable Bayesian Network Fusion via Minimum Cut Analysis [1.7086867242274812]
This paper presents the Greedy Min-Cut Bayesian Consensus (GMCBC) algorithm for the structural fusion of Bayesian Networks (BNs)
The method is designed to preserve essential dependencies while controlling network complexity.
It addresses the limitations of traditional fusion approaches, which often lead to excessively complex models.
arXiv Detail & Related papers (2025-04-01T06:47:33Z) - Structure-prior Informed Diffusion Model for Graph Source Localization with Limited Data [13.443269048443627]
This paper introduces SIDSL, a novel framework that addresses three key challenges in limited-data scenarios. SIDSL incorporates topology-aware priors through graph label propagation and employs a propagation-enhanced conditional denoiser. Experimental results across four real-world datasets demonstrate SIDSL's superior performance.
arXiv Detail & Related papers (2025-02-25T07:47:22Z) - Imbalance-Aware Culvert-Sewer Defect Segmentation Using an Enhanced Feature Pyramid Network [1.7466076090043157]
This paper introduces a deep learning model for the semantic segmentation of culverts and sewer pipes within imbalanced datasets.
The model employs strategies like class decomposition and data augmentation to address dataset imbalance.
Experimental results on the culvert-sewer defects dataset and a benchmark aerial semantic segmentation drone dataset show that the E-FPN outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-08-19T17:40:18Z) - Isomorphic Pruning for Vision Models [56.286064975443026]
Structured pruning reduces the computational overhead of deep neural networks by removing redundant sub-structures.
We present Isomorphic Pruning, a simple approach that demonstrates effectiveness across a range of network architectures.
arXiv Detail & Related papers (2024-07-05T16:14:53Z) - Chasing Fairness in Graphs: A GNN Architecture Perspective [73.43111851492593]
We propose Fair Message Passing (FMP), designed within a unified optimization framework for graph neural networks (GNNs).
In FMP, aggregation is first applied to utilize neighbors' information, and then a bias mitigation step explicitly pushes demographic group node representation centers together.
Experiments on node classification tasks demonstrate that the proposed FMP outperforms several baselines in terms of fairness and accuracy on three real-world datasets.
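The two-stage update in the FMP blurb can be caricatured in a few lines: mean aggregation over neighbors, then a correction that pulls each demographic group's embedding center toward the overall center. This is a loose NumPy illustration of the idea, not the actual FMP operator from the paper; the graph, features, and `step` size are invented.

```python
import numpy as np

def fair_message_passing(adj, feats, groups, step=0.5):
    """Toy two-stage update: (1) mean-aggregate neighbor features,
    (2) pull each group's center toward the overall center."""
    deg = adj.sum(axis=1, keepdims=True)
    h = adj @ feats / np.maximum(deg, 1)     # stage 1: aggregation
    overall = h.mean(axis=0)
    out = h.copy()
    for g in np.unique(groups):
        mask = groups == g
        center = h[mask].mean(axis=0)
        out[mask] += step * (overall - center)  # stage 2: debias shift
    return out

# Path graph 0-1-2-3 with two demographic groups {0,1} and {2,3}.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.array([[1., 0.], [1., 0.], [0., 1.], [0., 1.]])
g = np.array([0, 0, 1, 1])
H = fair_message_passing(A, X, g)
```

After the debias shift, the two group centers sit closer together than plain aggregation leaves them, which is the qualitative effect the paper's bias mitigation step aims for.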
arXiv Detail & Related papers (2023-12-19T18:00:15Z) - IIVA: A Simulation Based Generalized Framework for Interdependent Infrastructure Vulnerability Assessment [0.0]
This paper proposes a novel infrastructure vulnerability assessment framework that accounts for: various types of infrastructure interdependencies.
It is observed that the higher the initial failure rate of the components, the higher the vulnerability of the infrastructure.
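That monotone relationship between initial failure rate and cascade extent is easy to reproduce with a generic threshold-cascade sketch (this is a percolation-style toy, not the paper's IIVA simulator; the graph, threshold, and rates are arbitrary).

```python
import numpy as np

def cascade_size(adj, init_rate, rng, threshold=0.5):
    """Toy threshold cascade: seed random initial failures, then a
    working node fails once the failed fraction of its neighbors
    reaches `threshold`. Returns the final failed fraction."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    failed = rng.random(n) < init_rate       # initial random failures
    while True:
        frac = (adj @ failed) / np.maximum(deg, 1)
        new = ~failed & (frac >= threshold)  # newly overloaded nodes
        if not new.any():
            return failed.mean()
        failed |= new

rng = np.random.default_rng(42)
A = (rng.random((50, 50)) < 0.1).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # random undirected graph
low = cascade_size(A, 0.05, np.random.default_rng(1))
high = cascade_size(A, 0.40, np.random.default_rng(1))
```

Because both runs share a seed, the low-rate seed set is a subset of the high-rate one, and threshold cascades are monotone in their seed set, so `low <= high` holds by construction, mirroring the observation above.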
arXiv Detail & Related papers (2022-12-13T20:37:03Z) - A Bayesian Approach to Reconstructing Interdependent Infrastructure Networks from Cascading Failures [2.9364290037516496]
Understanding network interdependencies is crucial to anticipate cascading failures and plan for disruptions.
Data on the topology of individual networks are often publicly unavailable due to privacy and security concerns.
We propose a scalable nonparametric Bayesian approach to reconstruct the topology of interdependent infrastructure networks.
arXiv Detail & Related papers (2022-11-28T17:45:41Z) - EGRC-Net: Embedding-induced Graph Refinement Clustering Network [66.44293190793294]
We propose a novel graph clustering network called Embedding-Induced Graph Refinement Clustering Network (EGRC-Net)
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-19T09:08:43Z) - BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNN) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z) - IC Networks: Remodeling the Basic Unit for Convolutional Neural Networks [8.218732270970381]
The "Inter-layer Collision" (IC) structure can be integrated into existing CNNs to improve their performance.
A new training method, weak logit distillation (WLD), is proposed to speed up the training of IC networks.
In the ImageNet experiment, we integrate the IC structure into ResNet-50 and reduce the top-1 error from 22.38% to 21.75%.
arXiv Detail & Related papers (2021-02-06T03:15:43Z) - ACDC: Weight Sharing in Atom-Coefficient Decomposed Convolution [57.635467829558664]
We introduce a structural regularization across convolutional kernels in a CNN.
We show that CNNs now maintain performance with dramatic reduction in parameters and computations.
arXiv Detail & Related papers (2020-09-04T20:41:47Z) - Structured Convolutions for Efficient Neural Network Design [65.36569572213027]
We tackle model efficiency by exploiting redundancy in the implicit structure of the building blocks of convolutional neural networks.
We show how this decomposition can be applied to 2D and 3D kernels as well as the fully-connected layers.
arXiv Detail & Related papers (2020-08-06T04:38:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.