Learning Cellular Network Connection Quality with Conformal Prediction
- URL: http://arxiv.org/abs/2407.10976v1
- Date: Tue, 4 Jun 2024 19:33:00 GMT
- Title: Learning Cellular Network Connection Quality with Conformal Prediction
- Authors: Hanyang Jiang, Elizabeth Belding, Ellen Zegura, Yao Xie
- Abstract summary: We introduce a novel conformal prediction technique to build an uncertainty map.
We identify regions with heightened uncertainty to prioritize targeted, manual data collection.
Our method also leads to a sampling strategy that guides researchers to selectively gather high-quality data.
- Score: 5.695027198038298
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we address the problem of uncertainty quantification for cellular network speed. It is well known that the actual internet speed experienced by a mobile phone can fluctuate significantly, even at a single location. This high degree of variability underscores that a mere point estimate of network speed is insufficient; rather, it is advantageous to establish a prediction interval that encompasses the expected range of speed variations. Building an accurate network estimation map requires collecting a large amount of mobile data at different locations. Currently, public datasets rely on users to upload data through apps. Although massive amounts of data have been collected, these datasets suffer from significant noise due to the nature of cellular networks and various other factors. Additionally, the uneven distribution of population density affects the spatial consistency of data collection, leading to substantial uncertainty in the network quality maps derived from this data. We focus our analysis on large-scale internet-quality datasets provided by Ookla to construct an estimated map of connection quality. To improve the reliability of this map, we introduce a novel conformal prediction technique to build an uncertainty map. We identify regions with heightened uncertainty to prioritize targeted, manual data collection. In addition, the uncertainty map quantifies how reliable the prediction is in different areas. Our method also leads to a sampling strategy that guides researchers to selectively gather high-quality data that best complements the current dataset, improving the overall accuracy of the prediction model.
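The prediction intervals the abstract describes can be illustrated with a minimal split-conformal sketch for a scalar regression target such as network speed. The `MeanModel` regressor and the train/calibration split below are illustrative assumptions, not the paper's method; its spatial conformal technique is more involved, but the core interval construction follows this recipe:

```python
import numpy as np

class MeanModel:
    """Toy regressor predicting the training mean; stands in for any
    model with fit/predict (e.g. a scikit-learn-style estimator)."""
    def fit(self, X, y):
        self.mu = float(np.mean(y))
        return self

    def predict(self, X):
        return np.full(len(X), self.mu)

def split_conformal_interval(model, X_train, y_train, X_cal, y_cal, X_new, alpha=0.1):
    """Return (lower, upper) arrays forming ~(1 - alpha) coverage intervals."""
    model.fit(X_train, y_train)
    # Nonconformity scores: absolute residuals on a held-out calibration set.
    scores = np.abs(np.asarray(y_cal) - model.predict(X_cal))
    n = len(scores)
    # Finite-sample quantile with the (n + 1) correction.
    q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    pred = model.predict(X_new)
    return pred - q, pred + q
```

The appeal of this construction is that the coverage guarantee holds regardless of how inaccurate the underlying point predictor is; a poor model simply yields wider intervals, which is exactly the signal used here to flag regions needing more data.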
Related papers
- Enhancing stop location detection for incomplete urban mobility datasets [0.0]
This study investigates the application of classification algorithms to enhance density-based methods for stop identification.
Our approach incorporates multiple features, including individual routine behavior across various time scales and the local characteristics of individual GPS points.
arXiv Detail & Related papers (2024-07-16T10:41:08Z) - Diffusion-based Data Augmentation for Object Counting Problems [62.63346162144445]
We develop a pipeline that utilizes a diffusion model to generate extensive training data.
We are the first to generate images conditioned on a location dot map with a diffusion model.
Our proposed counting loss for the diffusion model effectively minimizes the discrepancies between the location dot map and the crowd images generated.
arXiv Detail & Related papers (2024-01-25T07:28:22Z) - Mobile Internet Quality Estimation using Self-Tuning Kernel Regression [7.6449549886709764]
We look into estimating mobile (cellular) internet quality at the scale of a state in the United States.
Most of the samples are concentrated in limited areas, while very few are available in the rest.
We propose a new adaptive kernel regression approach that employs self-tuning kernels to alleviate the adverse effects of data imbalance.
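The adaptive kernel idea in this entry can be sketched as a Nadaraya-Watson estimator whose bandwidth at each training point scales with its k-th nearest-neighbour distance, so sparsely sampled regions get wider kernels than dense ones. This is a common self-tuning heuristic assumed here for illustration; the paper's exact tuning rule may differ:

```python
import numpy as np

def self_tuning_kernel_regression(X, y, X_query, k=5):
    """Nadaraya-Watson regression with per-sample Gaussian bandwidths set
    to each training point's distance to its k-th nearest neighbour."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    X_query = np.asarray(X_query, float)
    # Pairwise distances between training points.
    d_train = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Self-tuned bandwidth: k-th nearest-neighbour distance (index 0 is self).
    h = np.sort(d_train, axis=1)[:, k]
    # Distances from query points to training points.
    d = np.linalg.norm(X_query[:, None, :] - X[None, :, :], axis=-1)
    # Gaussian kernel with a per-training-sample bandwidth.
    w = np.exp(-0.5 * (d / h[None, :]) ** 2)
    return (w * y[None, :]).sum(axis=1) / w.sum(axis=1)
```

With a fixed bandwidth, dense urban areas would dominate and sparse rural areas would be over-smoothed; letting the bandwidth grow with local sparsity is what mitigates the data imbalance described above.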
arXiv Detail & Related papers (2023-11-04T21:09:46Z) - Distil the informative essence of loop detector data set: Is network-level traffic forecasting hungry for more data? [0.8002196839441036]
We propose an uncertainty-aware traffic forecasting framework to explore how many samples of loop data are truly effective for training forecasting models.
The proposed methodology proves valuable in evaluating large traffic datasets' true information content.
arXiv Detail & Related papers (2023-10-31T11:23:10Z) - Bayesian Neural Network Versus Ex-Post Calibration For Prediction Uncertainty [0.2343856409260935]
Probabilistic predictions from neural networks account for predictive uncertainty during classification.
In practice, most models are trained as non-probabilistic neural networks, which by default do not capture this inherent uncertainty.
A plausible alternative to the calibration approach is to use Bayesian neural networks, which directly model a predictive distribution.
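Temperature scaling is a standard example of the ex-post calibration this entry contrasts with Bayesian networks (chosen here for illustration; the paper may compare other calibrators). A single temperature is fit on held-out logits from an already-trained classifier:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; T > 1 softens overconfident predictions."""
    z = logits / T
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(logits, labels, grid=None):
    """Grid-search the temperature minimizing held-out negative log-likelihood."""
    if grid is None:
        grid = np.linspace(0.5, 5.0, 46)  # includes T = 1.0 (no rescaling)
    def nll(T):
        p = softmax(logits, T)[np.arange(len(labels)), labels]
        return -np.mean(np.log(p + 1e-12))
    return min(grid, key=nll)
```

Because only one scalar is fit after training, temperature scaling changes confidence but never the predicted class, which is what makes it cheap relative to training a Bayesian network.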
arXiv Detail & Related papers (2022-09-29T07:22:19Z) - Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
They are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
arXiv Detail & Related papers (2021-02-22T07:02:37Z) - Uncertainty Estimation and Sample Selection for Crowd Counting [87.29137075538213]
We present a method for image-based crowd counting that can predict a crowd density map together with the uncertainty values pertaining to the predicted density map.
A key advantage of our method over existing crowd counting methods is its ability to quantify the uncertainty of its predictions.
We show that our sample selection strategy drastically reduces the amount of labeled data needed to adapt a counting network trained on a source domain to the target domain.
arXiv Detail & Related papers (2020-09-30T03:40:07Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z) - JHU-CROWD++: Large-Scale Crowd Counting Dataset and A Benchmark Method [92.15895515035795]
We introduce a new large-scale unconstrained crowd counting dataset (JHU-CROWD++) that contains 4,372 images with 1.51 million annotations.
We propose a novel crowd counting network that progressively generates crowd density maps via residual error estimation.
arXiv Detail & Related papers (2020-04-07T14:59:35Z) - Diversity inducing Information Bottleneck in Model Ensembles [73.80615604822435]
In this paper, we target the problem of generating effective ensembles of neural networks by encouraging diversity in prediction.
We explicitly optimize a diversity inducing adversarial loss for learning latent variables and thereby obtain diversity in the output predictions necessary for modeling multi-modal data.
Compared to the most competitive baselines, we show significant improvements in classification accuracy under a shift in the data distribution.
arXiv Detail & Related papers (2020-03-10T03:10:41Z) - Better coverage, better outcomes? Mapping mobile network data to official statistics using satellite imagery and radio propagation modelling [0.0]
I use human settlement information extracted from publicly available satellite imagery, in combination with radio propagation modelling techniques, to approximate mobile network coverage.
I investigate in a simulation study and a real-world application on unemployment estimates in Senegal whether better coverage approximations lead to better outcome predictions.
arXiv Detail & Related papers (2020-02-20T14:19:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.