Causal Discovery in Dynamic Fading Wireless Networks
- URL: http://arxiv.org/abs/2506.03163v2
- Date: Mon, 10 Nov 2025 15:51:13 GMT
- Title: Causal Discovery in Dynamic Fading Wireless Networks
- Authors: Oluwaseyi Giwa,
- Abstract summary: This paper addresses causal inference challenges in dynamic fading wireless environments by proposing a sequential regression-based algorithm. We derive theoretical lower and upper bounds on the detection delay required to identify structural changes, explicitly quantifying their dependence on network size, noise variance, and fading severity. Our findings provide rigorous theoretical insights and practical guidelines for designing robust online causal inference mechanisms to maintain network reliability under nonstationary wireless conditions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dynamic causal discovery in wireless networks is essential due to evolving interference, fading, and mobility, which complicate traditional static causal models. This paper addresses causal inference challenges in dynamic fading wireless environments by proposing a sequential regression-based algorithm with a novel application of the NOTEARS acyclicity constraint, enabling efficient online updates. We derive theoretical lower and upper bounds on the detection delay required to identify structural changes, explicitly quantifying their dependence on network size, noise variance, and fading severity. Monte Carlo simulations validate these theoretical results, demonstrating linear increases in detection delay with network size, quadratic growth with noise variance, and inverse-square dependence on the magnitude of structural changes. Our findings provide rigorous theoretical insights and practical guidelines for designing robust online causal inference mechanisms to maintain network reliability under nonstationary wireless conditions.
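The abstract's key technical ingredient is the NOTEARS acyclicity constraint, which lets regression-based structure learning enforce that the estimated weighted adjacency matrix describes a DAG. The paper's exact online formulation is not reproduced here, but the standard NOTEARS penalty h(W) = tr(exp(W ∘ W)) − d can be sketched as follows (function and variable names are illustrative, not from the paper):

```python
import numpy as np
from scipy.linalg import expm

def notears_acyclicity(W: np.ndarray) -> float:
    """Standard NOTEARS acyclicity penalty h(W) = tr(exp(W * W)) - d.

    h(W) == 0 exactly when the weighted adjacency matrix W is a DAG;
    h(W) > 0 whenever W contains a directed cycle, so it can be used
    as a smooth equality constraint inside a regression objective.
    """
    d = W.shape[0]
    # W * W is the elementwise square, keeping the penalty differentiable
    # and insensitive to edge-weight signs.
    return float(np.trace(expm(W * W)) - d)

# A strictly upper-triangular W is acyclic, so the penalty vanishes.
W_dag = np.array([[0.0, 1.5,  0.0],
                  [0.0, 0.0, -2.0],
                  [0.0, 0.0,  0.0]])

# Adding the edge 2 -> 0 closes the cycle 0 -> 1 -> 2 -> 0.
W_cyc = W_dag.copy()
W_cyc[2, 0] = 0.7
```

In an online setting, this penalty would be re-evaluated as each new regression update arrives, flagging a structural change when the constrained fit degrades.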
Related papers
- Causal Structure Learning for Dynamical Systems with Theoretical Score Analysis [7.847876045564289]
Real-world systems evolve in continuous time according to their underlying causal relationships, yet their dynamics are often unknown. We propose CaDyT, a novel method for causal discovery on dynamical systems. Our experiments show that CaDyT outperforms state-of-the-art methods on both regularly and irregularly sampled data.
arXiv Detail & Related papers (2025-12-16T12:41:22Z) - A Residual Guided strategy with Generative Adversarial Networks in training Physics-Informed Transformer Networks [8.614387766858496]
We propose a novel Residual Guided Training strategy for Physics-Informed Transformers via Generative Adversarial Networks (GANs). Our framework integrates a Transformer to inherently capture temporal correlations through autoregressive processing, coupled with a residual-aware GAN. Experiments on the Allen-Cahn-Gordon and Navier-Stokes equations demonstrate significant improvements, with relative MSE reductions of up to three orders of magnitude compared to baseline methods.
arXiv Detail & Related papers (2025-07-15T03:45:42Z) - Topology-Aware Conformal Prediction for Stream Networks [54.505880918607296]
We propose Spatio-Temporal Adaptive Conformal Inference (CISTA), a novel framework that integrates network topology and temporal dynamics into the conformal prediction framework. Our results show that CISTA effectively balances prediction efficiency and coverage, outperforming existing conformal prediction methods for stream networks.
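CISTA builds on the conformal prediction framework; its topology-aware machinery is not shown here, but the baseline it extends, plain split conformal intervals from calibration residuals, can be sketched as follows (names and the finite-sample correction are the standard textbook construction, not taken from the paper):

```python
import numpy as np

def split_conformal_interval(cal_pred, cal_true, test_pred, alpha=0.1):
    """Plain split-conformal prediction interval.

    Uses the absolute residuals on a held-out calibration set to pick a
    half-width q, then returns (lo, hi) arrays around each test
    prediction with approximately 1 - alpha marginal coverage.
    """
    resid = np.abs(np.asarray(cal_true) - np.asarray(cal_pred))
    n = len(resid)
    # Finite-sample-corrected quantile level ceil((n+1)(1-alpha)) / n.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(resid, level)
    test_pred = np.asarray(test_pred)
    return test_pred - q, test_pred + q

# Nine calibration residuals 1..9 at alpha = 0.1 select the largest one.
lo, hi = split_conformal_interval(
    cal_pred=np.zeros(9),
    cal_true=np.arange(1.0, 10.0),
    test_pred=np.array([0.0]),
)
```

Stream-network methods like CISTA replace this single global quantile with one that adapts to graph position and time.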
arXiv Detail & Related papers (2025-03-06T21:21:15Z) - A Tunable Despeckling Neural Network Stabilized via Diffusion Equation [15.996302571895045]
Adversarial attacks can be used as a criterion for judging the adaptability of neural networks to real data. We propose a tunable, regularized neural network framework that unrolls a shallow denoising neural network block and a diffusion regularization block into a single network for end-to-end training.
arXiv Detail & Related papers (2024-11-24T17:08:43Z) - Variational Bayesian Bow tie Neural Networks with Shrinkage [0.276240219662896]
We develop a fast, approximate variational inference algorithm that avoids distributional assumptions and independence across layers. We use Polya-Gamma data augmentation tricks, which render the model conditionally linear and Gaussian.
arXiv Detail & Related papers (2024-11-17T17:36:30Z) - CaTs and DAGs: Integrating Directed Acyclic Graphs with Transformers and Fully-Connected Neural Networks for Causally Constrained Predictions [6.745494093127968]
We introduce Causal Fully-Connected Neural Networks (CFCNs) and Causal Transformers (CaTs).
CFCNs and CaTs operate under predefined causal constraints, as specified by a Directed Acyclic Graph (DAG).
These models retain the powerful function approximation abilities of traditional neural networks while adhering to the underlying structural constraints.
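The core idea of a causally constrained network, keeping only the weights that correspond to edges permitted by a DAG, can be sketched with a single masked linear layer (a deliberately minimal illustration; the actual CFCN and CaT architectures are more elaborate, and all names below are hypothetical):

```python
import numpy as np

def dag_masked_forward(x, W, dag_adjacency):
    """One linear pass constrained by a DAG.

    Weight (i, j) survives only if the DAG permits the edge i -> j;
    every other weight is zeroed, so information cannot flow along
    causally forbidden paths.

    x: (batch, d) inputs; W: (d, d) free weights;
    dag_adjacency: (d, d) 0/1 matrix of permitted edges.
    """
    masked_W = W * dag_adjacency  # hard causal constraint
    return x @ masked_W

# Chain DAG 0 -> 1 -> 2: only edges (0, 1) and (1, 2) are allowed.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]], dtype=float)
W = np.full((3, 3), 5.0)   # unconstrained weights, all nonzero
x = np.eye(3)              # one-hot probes, one per variable
y = dag_masked_forward(x, W, A)
```

Probing with one-hot inputs shows that variable 0 influences only variable 1, and the sink variable 2 influences nothing, exactly the structure the DAG prescribes.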
arXiv Detail & Related papers (2024-10-18T14:10:16Z) - TDNetGen: Empowering Complex Network Resilience Prediction with Generative Augmentation of Topology and Dynamics [14.25304439234864]
We introduce a novel resilience prediction framework for complex networks, designed to tackle this issue through generative data augmentation of network topology and dynamics.
Experiment results on three network datasets demonstrate that our proposed framework TDNetGen can achieve prediction accuracy as high as 85%-95%.
arXiv Detail & Related papers (2024-08-19T09:20:31Z) - Leveraging Low-Rank and Sparse Recurrent Connectivity for Robust Closed-Loop Control [63.310780486820796]
We show how a parameterization of recurrent connectivity influences robustness in closed-loop settings.
We find that closed-form continuous-time neural networks (CfCs) with fewer parameters can outperform their full-rank, fully-connected counterparts.
arXiv Detail & Related papers (2023-10-05T21:44:18Z) - Input correlations impede suppression of chaos and learning in balanced rate networks [58.720142291102135]
Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity.
We show that in firing-rate networks in the balanced state, external control of recurrent dynamics strongly depends on correlations in the input.
arXiv Detail & Related papers (2022-01-24T19:20:49Z) - Non-Singular Adversarial Robustness of Neural Networks [58.731070632586594]
Adversarial robustness has become an emerging challenge for neural networks owing to their over-sensitivity to small input perturbations.
We formalize the notion of non-singular adversarial robustness for neural networks through the lens of joint perturbations to data inputs as well as model weights.
arXiv Detail & Related papers (2021-02-23T20:59:30Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z) - Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.