Heteroscedastic Neural Networks for Path Loss Prediction with Link-Specific Uncertainty
- URL: http://arxiv.org/abs/2511.23243v1
- Date: Fri, 28 Nov 2025 14:52:18 GMT
- Title: Heteroscedastic Neural Networks for Path Loss Prediction with Link-Specific Uncertainty
- Authors: Jonathan Ethier,
- Abstract summary: We propose a neural network that jointly predicts the mean and link-specific variance. These uncertainty estimates further support link-specific coverage margins, improve RF planning and interference analyses, and provide effective self-diagnostics of model weaknesses.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Traditional and modern machine learning-based path loss models typically assume a constant prediction variance. We propose a neural network that jointly predicts the mean and link-specific variance by minimizing a Gaussian negative log-likelihood, enabling heteroscedastic uncertainty estimates. We compare shared, partially shared, and independent-parameter architectures using accuracy, calibration, and sharpness metrics on blind test sets from large public RF drive-test datasets. The shared-parameter architecture performs best, achieving an RMSE of 7.4 dB, 95.1 percent coverage for 95 percent prediction intervals, and a mean interval width of 29.6 dB. These uncertainty estimates further support link-specific coverage margins, improve RF planning and interference analyses, and provide effective self-diagnostics of model weaknesses.
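As a rough illustration of the approach described in the abstract, the sketch below trains a shared-trunk network with mean and log-variance heads on a Gaussian negative log-likelihood, then computes 95 percent interval coverage and mean interval width (the calibration and sharpness metrics quoted above). Layer sizes, the feature dimensionality, and the stand-in data are assumptions for illustration, not the authors' configuration; PyTorch's built-in torch.nn.GaussianNLLLoss could replace the hand-rolled loss.
```python
# Minimal heteroscedastic regression sketch: a shared trunk feeds separate
# mean and log-variance heads, trained with a Gaussian negative log-likelihood.
# All sizes and the random stand-in data are illustrative assumptions.
import torch
import torch.nn as nn

class HeteroscedasticMLP(nn.Module):
    def __init__(self, n_features: int, hidden: int = 128):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.mean_head = nn.Linear(hidden, 1)     # predicted path loss (dB)
        self.logvar_head = nn.Linear(hidden, 1)   # log variance, for numerical stability

    def forward(self, x):
        h = self.trunk(x)
        return self.mean_head(h).squeeze(-1), self.logvar_head(h).squeeze(-1)

def gaussian_nll(y, mean, logvar):
    # 0.5 * [log(sigma^2) + (y - mu)^2 / sigma^2], additive constant dropped.
    return 0.5 * (logvar + (y - mean) ** 2 * torch.exp(-logvar)).mean()

model = HeteroscedasticMLP(n_features=8)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(256, 8), torch.randn(256)    # stand-in link features / measured loss (dB)

opt.zero_grad()
mean, logvar = model(x)
loss = gaussian_nll(y, mean, logvar)
loss.backward()
opt.step()

# Link-specific 95% prediction intervals, empirical coverage, and sharpness.
with torch.no_grad():
    mean, logvar = model(x)
    sigma = torch.exp(0.5 * logvar)
    lo, hi = mean - 1.96 * sigma, mean + 1.96 * sigma
    coverage = ((y >= lo) & (y <= hi)).float().mean()   # target: ~0.95
    mean_width = (hi - lo).mean()                       # interval sharpness in dB
```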
Related papers
- Conformal Prediction for Multi-Source Detection on a Network [59.17729745907474]
We study the multi-source detection problem: given snapshot observations of node infection status on a graph, estimate the set of source nodes that initiated the propagation. We propose a novel conformal prediction framework that provides statistically valid recall guarantees for source set detection.
arXiv Detail & Related papers (2025-11-12T01:09:56Z)
- Uncertainty Awareness on Unsupervised Domain Adaptation for Time Series Data [49.36938105983916]
Unsupervised domain adaptation methods seek to generalize effectively on unlabeled test data. We propose incorporating multi-scale feature extraction and uncertainty estimation to improve the model's generalization and robustness across domains.
arXiv Detail & Related papers (2025-08-26T03:13:08Z)
- An analysis of the noise schedule for score-based generative models [7.180235086275926]
Score-based generative models (SGMs) aim at estimating a target data distribution by learning score functions using only noise-perturbed samples from the target. Recent literature has focused extensively on assessing the error between the target and estimated distributions, gauging the generative quality through the Kullback-Leibler (KL) divergence and Wasserstein distances. We establish an upper bound for the KL divergence between the target and the estimated distributions, explicitly depending on any time-dependent noise schedule.
arXiv Detail & Related papers (2024-02-07T08:24:35Z)
- An AI-enabled Bias-Free Respiratory Disease Diagnosis Model using Cough Audio: A Case Study for COVID-19 [1.1146119513912156]
We propose the Bias Free Network (RBFNet) to mitigate the impact of confounders in the training data distribution.
RBFNet ensures accurate and unbiased RD diagnosis features, emphasizing its relevance by incorporating a COVID-19 dataset.
An additional bias predictor is incorporated in the classification scheme to formulate a conditional Generative Adversarial Network (cGAN).
arXiv Detail & Related papers (2024-01-04T13:09:45Z)
- Uncertainty Quantification over Graph with Conformalized Graph Neural Networks [52.20904874696597]
Graph Neural Networks (GNNs) are powerful machine learning prediction models on graph-structured data.
GNNs lack rigorous uncertainty estimates, limiting their reliable deployment in settings where the cost of errors is significant.
We propose conformalized GNN (CF-GNN), extending conformal prediction (CP) to graph-based models for guaranteed uncertainty estimates.
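For context on the conformal machinery that CF-GNN extends, the following is a generic split-conformal sketch for regression: calibrate a residual quantile on held-out data, then form intervals with guaranteed marginal coverage under exchangeability. It is not the CF-GNN algorithm itself (which adapts non-conformity scores to graph structure); the model and data here are placeholders.
```python
# Generic split conformal prediction for regression: a residual quantile from a
# calibration split yields intervals with finite-sample marginal coverage.
import numpy as np

def split_conformal_intervals(predict, X_cal, y_cal, X_test, alpha=0.05):
    # Non-conformity scores: absolute residuals on the calibration split.
    scores = np.abs(y_cal - predict(X_cal))
    n = len(scores)
    # Finite-sample-corrected quantile level.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q_hat = np.quantile(scores, q_level, method="higher")
    preds = predict(X_test)
    return preds - q_hat, preds + q_hat

# Usage with a stand-in prediction rule and synthetic data.
rng = np.random.default_rng(0)
w = np.array([1.0, -2.0, 0.5])
X_cal, X_test = rng.normal(size=(200, 3)), rng.normal(size=(50, 3))
y_cal = X_cal @ w + rng.normal(scale=0.3, size=200)
predict = lambda X: X @ w
lo, hi = split_conformal_intervals(predict, X_cal, y_cal, X_test)
```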
arXiv Detail & Related papers (2023-05-23T21:38:23Z)
- Jensen-Shannon Divergence Based Novel Loss Functions for Bayesian Neural Networks [2.4554686192257424]
We formulate a novel loss function for BNNs based on a new modification to the generalized Jensen-Shannon (JS) divergence, which is bounded. We find that JS divergence-based variational inference is intractable and hence employ a constrained optimization framework to formulate these losses. Our theoretical analysis and empirical experiments on multiple regression and classification data sets suggest that the proposed losses perform better than the KL divergence-based loss, especially when the data sets are noisy or biased.
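For reference, the standard Jensen-Shannon divergence is the symmetrized counterpart of the KL divergence and, unlike KL, is bounded; the paper's loss builds on a generalized, modified form of this quantity, which the display below does not capture.
```latex
\mathrm{JS}(p \,\|\, q)
  = \tfrac{1}{2}\,\mathrm{KL}(p \,\|\, m) + \tfrac{1}{2}\,\mathrm{KL}(q \,\|\, m),
\qquad m = \tfrac{1}{2}(p + q),
\qquad 0 \le \mathrm{JS}(p \,\|\, q) \le \log 2.
```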
arXiv Detail & Related papers (2022-09-23T01:47:09Z)
- Accurate Prediction and Uncertainty Estimation using Decoupled Prediction Interval Networks [0.0]
We propose a network architecture capable of reliably estimating uncertainty of regression based predictions without sacrificing accuracy.
We achieve this by breaking down the learning of prediction and prediction interval (PI) estimations into a two-stage training process.
We compare the proposed method with current state-of-the-art uncertainty quantification algorithms on synthetic datasets and UCI benchmarks, reducing the error in the predictions by 23 to 34%.
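A minimal sketch of the two-stage idea described above: fit a point predictor first, then freeze it and fit lower/upper interval offsets around it. Using a pinball (quantile) loss for the second stage is an assumption made here for illustration, not necessarily the paper's formulation.
```python
# Two-stage decoupled training sketch: stage 1 fits the point predictor,
# stage 2 freezes it and learns nonnegative interval offsets around it with a
# pinball (quantile) loss. Sizes, losses, and data are illustrative assumptions.
import torch
import torch.nn as nn

def pinball(y, pred, tau):
    e = y - pred
    return torch.maximum(tau * e, (tau - 1) * e).mean()

x, y = torch.randn(512, 6), torch.randn(512)  # stand-in features and targets

# Stage 1: point prediction network trained on squared error.
mean_net = nn.Sequential(nn.Linear(6, 64), nn.ReLU(), nn.Linear(64, 1))
opt1 = torch.optim.Adam(mean_net.parameters(), lr=1e-3)
for _ in range(100):
    opt1.zero_grad()
    loss = nn.functional.mse_loss(mean_net(x).squeeze(-1), y)
    loss.backward()
    opt1.step()

# Stage 2: freeze the point predictor; learn 2.5%/97.5% offsets around it.
for p in mean_net.parameters():
    p.requires_grad_(False)
offset_net = nn.Sequential(nn.Linear(6, 64), nn.ReLU(), nn.Linear(64, 2), nn.Softplus())
opt2 = torch.optim.Adam(offset_net.parameters(), lr=1e-3)
for _ in range(100):
    opt2.zero_grad()
    mu = mean_net(x).squeeze(-1)
    d_lo, d_hi = offset_net(x).unbind(dim=-1)
    loss = pinball(y, mu - d_lo, 0.025) + pinball(y, mu + d_hi, 0.975)
    loss.backward()
    opt2.step()
```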
arXiv Detail & Related papers (2022-02-19T19:31:36Z)
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
They are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
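A rough sketch of the stated idea, with a large assumption: heavily perturbed copies of the training inputs stand in for the paper's procedure for locating unjustifiably overconfident regions, and a KL penalty pulls the predictions there toward the label prior, raising their entropy.
```python
# Sketch of entropy regularization toward the label prior on augmented inputs.
# The heavy-noise augmentation is a stand-in, not the paper's procedure.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 5))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(128, 10)
y = torch.randint(0, 5, (128,))
label_prior = torch.bincount(y, minlength=5).float()
label_prior = label_prior / label_prior.sum()

# Augmented points far from the data, where high confidence is unjustified.
x_aug = x + 5.0 * torch.randn_like(x)

opt.zero_grad()
ce = F.cross_entropy(model(x), y)
log_p_aug = F.log_softmax(model(x_aug), dim=-1)
# Pull augmented-point predictive distributions toward the label prior.
prior_kl = F.kl_div(log_p_aug, label_prior.expand_as(log_p_aug), reduction="batchmean")
loss = ce + 0.5 * prior_kl
loss.backward()
opt.step()
```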
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out-of-distribution data points at test time with a single forward pass.
We scale training with a novel loss function and centroid updating scheme, matching the accuracy of softmax models.
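Roughly, the core mechanism is a single deterministic network whose per-class uncertainty is an RBF kernel value against learned class centroids; inputs whose best kernel value is low are rejected. The feature extractor, length scale, and threshold below are illustrative assumptions, and the centroid update (an exponential moving average during training, to my understanding) is only noted in a comment.
```python
# Sketch of centroid-based uncertainty in a single deterministic network:
# uncertainty is the RBF kernel value to the closest class centroid, and inputs
# whose best kernel value is low are rejected as out-of-distribution.
import torch
import torch.nn as nn

n_classes, feat_dim, sigma, reject_threshold = 5, 32, 0.5, 0.1
feature_net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, feat_dim))
centroids = torch.randn(n_classes, feat_dim)  # during training: updated from class features (e.g. EMA)

def predict_with_rejection(x):
    f = feature_net(x)                        # (batch, feat_dim)
    d2 = torch.cdist(f, centroids) ** 2       # squared distance to each class centroid
    k = torch.exp(-d2 / (2 * sigma ** 2))     # RBF kernel value per class
    conf, pred = k.max(dim=-1)                # closest centroid and its kernel value
    pred[conf < reject_threshold] = -1        # -1 marks rejected (out-of-distribution) inputs
    return pred, conf

pred, conf = predict_with_rejection(torch.randn(8, 10))
```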
arXiv Detail & Related papers (2020-03-04T12:27:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.