Estimating Latent Demand of Shared Mobility through Censored Gaussian
Processes
- URL: http://arxiv.org/abs/2001.07402v2
- Date: Mon, 17 Feb 2020 12:09:18 GMT
- Title: Estimating Latent Demand of Shared Mobility through Censored Gaussian
Processes
- Authors: Daniele Gammelli, Inon Peled, Filipe Rodrigues, Dario Pacino, Haci A.
Kurtaran, Francisco C. Pereira
- Abstract summary: Transport demand is highly dependent on supply, especially for shared transport services where availability is often limited.
As observed demand cannot be higher than available supply, historical transport data typically represents a biased, or censored, version of the true underlying demand pattern.
We propose a general method for censorship-aware demand modeling, for which we devise a censored likelihood function.
- Score: 11.695095006311176
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transport demand is highly dependent on supply, especially for shared
transport services where availability is often limited. As observed demand
cannot be higher than available supply, historical transport data typically
represents a biased, or censored, version of the true underlying demand
pattern. Without explicitly accounting for this inherent distinction,
predictive models of demand would necessarily represent a biased version of
true demand, thus less effectively predicting the needs of service users. To
counter this problem, we propose a general method for censorship-aware demand
modeling, for which we devise a censored likelihood function. We apply this
method to the task of shared mobility demand prediction by incorporating the
censored likelihood within a Gaussian Process model, which can flexibly
approximate arbitrary functional forms. Experiments on artificial and
real-world datasets show how taking into account the limiting effect of supply
on demand is essential in the process of obtaining an unbiased predictive model
of user demand behavior.
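As a minimal sketch of the kind of censored likelihood the abstract describes (not the authors' implementation), the snippet below evaluates a Tobit-style censored Gaussian log-likelihood: uncensored observations contribute the usual Gaussian density around the latent demand, while observations that hit the supply cap only contribute the probability that latent demand exceeds the observed value. The GP prior and the inference machinery are omitted, and all names are illustrative.

```python
import numpy as np
from scipy.stats import norm

def censored_gaussian_loglik(y, f, sigma, censored):
    """Tobit-style log-likelihood for right-censored observations.

    y        : observed demand (equal to supply where censored)
    f        : latent function values, e.g. a GP posterior mean sample
    sigma    : observation noise standard deviation (constant here)
    censored : boolean mask, True where demand hit the supply cap
    """
    # Uncensored points contribute the usual Gaussian density.
    ll_obs = norm.logpdf(y, loc=f, scale=sigma)
    # Censored points only tell us that true demand >= observed supply,
    # so they contribute the log survival function log(1 - Phi((y - f)/sigma)).
    ll_cens = norm.logsf(y, loc=f, scale=sigma)
    return np.sum(np.where(censored, ll_cens, ll_obs))

# Toy usage: the supply cap of 8 was hit at the last two points.
y = np.array([3.0, 5.0, 8.0, 8.0])           # observed rentals
f = np.array([3.2, 4.8, 9.5, 10.1])          # candidate latent demand
mask = np.array([False, False, True, True])
print(censored_gaussian_loglik(y, f, sigma=1.0, censored=mask))
```

Replacing the Gaussian density on censored points with the survival term is what removes the downward bias: the model is rewarded, not penalized, for predicting demand above the observed supply level at those points.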
Related papers
- F-FOMAML: GNN-Enhanced Meta-Learning for Peak Period Demand Forecasting with Proxy Data [65.6499834212641]
We formulate the demand prediction as a meta-learning problem and develop the Feature-based First-Order Model-Agnostic Meta-Learning (F-FOMAML) algorithm.
By considering domain similarities through task-specific metadata, our model improves generalization, with the excess risk decreasing as the number of training tasks increases.
Compared to existing state-of-the-art models, our method demonstrates a notable improvement in demand prediction accuracy, reducing the Mean Absolute Error by 26.24% on an internal vending machine dataset and by 1.04% on the publicly accessible JD.com dataset.
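As a rough illustration of the first-order meta-learning update underlying FOMAML (the paper's F-FOMAML additionally injects task-specific metadata through a GNN, which is not reproduced here), a minimal sketch with a linear model and squared loss, all names illustrative:

```python
import numpy as np

def grad_mse(w, X, y):
    """Gradient of mean squared error for a linear model y ~ X @ w."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def fomaml_step(w, tasks, inner_lr=0.01, outer_lr=0.001, inner_steps=3):
    """One first-order MAML update over (X_train, y_train, X_val, y_val) tasks.

    First-order MAML drops second derivatives: the meta-gradient is simply
    the validation gradient evaluated at the task-adapted parameters.
    """
    meta_grad = np.zeros_like(w)
    for X_tr, y_tr, X_val, y_val in tasks:
        w_task = w.copy()
        for _ in range(inner_steps):                 # task-specific adaptation
            w_task -= inner_lr * grad_mse(w_task, X_tr, y_tr)
        meta_grad += grad_mse(w_task, X_val, y_val)  # first-order meta-gradient
    return w - outer_lr * meta_grad / len(tasks)
```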
arXiv Detail & Related papers (2024-06-23T21:28:50Z)
- Low-rank finetuning for LLMs: A fairness perspective [54.13240282850982]
Low-rank approximation techniques have become the de facto standard for fine-tuning Large Language Models.
This paper investigates the effectiveness of these methods in capturing the shift of fine-tuning datasets from the initial pre-trained data distribution.
We show that low-rank fine-tuning inadvertently preserves undesirable biases and toxic behaviors.
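For context, the low-rank parameterization studied in this line of work is typically W_eff = W_frozen + (alpha / r) * B @ A. A minimal sketch of that adapter (illustrative names only, not the paper's fairness analysis):

```python
import numpy as np

class LoRALinear:
    """Low-rank adapted linear layer: W_eff = W_frozen + (alpha / r) * B @ A."""

    def __init__(self, W_frozen, r=8, alpha=16.0, rng=None):
        rng = rng or np.random.default_rng(0)
        d_out, d_in = W_frozen.shape
        self.W = W_frozen                         # pre-trained weights, kept frozen
        self.A = rng.normal(0, 0.01, (r, d_in))   # trainable low-rank factor
        self.B = np.zeros((d_out, r))             # zero init => no update at start
        self.scale = alpha / r

    def forward(self, x):
        # Only A and B are trained; the full-rank W never changes,
        # which is why pre-training biases can persist after fine-tuning.
        return x @ (self.W + self.scale * self.B @ self.A).T
```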
arXiv Detail & Related papers (2024-05-28T20:43:53Z)
- User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and the samples match data statistics even when sampling from the tails of the distribution.
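The standard identity behind conditioning a score-based model on an event E is the textbook Bayes decomposition below; the paper develops a probabilistic approximation to the second term, which is not reproduced here.

```latex
\nabla_x \log p(x \mid E) = \nabla_x \log p(x) + \nabla_x \log p(E \mid x)
```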
arXiv Detail & Related papers (2023-06-13T03:42:03Z)
- Preference Enhanced Social Influence Modeling for Network-Aware Cascade Prediction [59.221668173521884]
We propose a novel framework to promote cascade size prediction by enhancing the user preference modeling.
Our end-to-end method makes the user activation process of information diffusion more adaptive and accurate.
arXiv Detail & Related papers (2022-04-18T09:25:06Z)
- End-to-End Demand Response Model Identification and Baseline Estimation with Deep Learning [3.553493344868414]
This paper proposes a novel end-to-end deep learning framework that simultaneously identifies demand baselines and the incentive-based agent demand response model.
We demonstrate the effectiveness of our approach through computational experiments with synthetic demand response traces and a large-scale real-world demand response dataset.
arXiv Detail & Related papers (2021-09-02T06:43:37Z)
- Causally-motivated Shortcut Removal Using Auxiliary Labels [63.686580185674195]
A key challenge in learning such risk-invariant predictors is shortcut learning.
We propose a flexible, causally-motivated approach to address this challenge.
We show both theoretically and empirically that this causally-motivated regularization scheme yields robust predictors.
arXiv Detail & Related papers (2021-05-13T16:58:45Z)
- Modeling Censored Mobility Demand through Quantile Regression Neural Networks [21.528321119061694]
We show that CQRNN can estimate the intended distributions better than both censorship-unaware models and parametric censored models.
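A common building block for censored quantile regression is the Powell-style censored pinball loss, sketched below; CQRNN's exact formulation and network architecture may differ, and the names here are illustrative.

```python
import numpy as np

def censored_pinball_loss(y_obs, q_pred, tau, cap):
    """Powell-style censored quantile (pinball) loss for right-censored data.

    y_obs  : observed demand, already capped at the available supply
    q_pred : predicted tau-quantile of the *latent* demand
    tau    : target quantile level in (0, 1)
    cap    : supply level at which each observation was censored
    """
    # Observations cannot exceed the cap, so the prediction is clipped at the
    # cap before being compared with the observation.
    diff = y_obs - np.minimum(q_pred, cap)
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))
```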
arXiv Detail & Related papers (2021-04-02T19:24:15Z)
- Reframing demand forecasting: a two-fold approach for lumpy and intermittent demand [0.9137554315375922]
We show that competitive demand forecasts can be obtained through two models: predicting the demand occurrence and estimating the demand size.
Our research shows that global classification models are the best choice when predicting demand event occurrence.
We tested our approach on real-world data consisting of 516 three-year-long time series corresponding to European automotive original equipment manufacturers' daily demand.
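A minimal sketch of the two-model decomposition described above, combining an occurrence classifier with a size regressor into an expected demand; the specific models used in the paper are not reproduced here.

```python
import numpy as np

def two_stage_forecast(p_occurrence, expected_size):
    """Expected demand per period from the two-fold decomposition.

    p_occurrence  : P(demand event happens), from a classification model
    expected_size : E[demand size | event happens], from a regression model
    """
    return np.asarray(p_occurrence) * np.asarray(expected_size)

# Toy usage: three future days for one lumpy-demand part.
print(two_stage_forecast([0.1, 0.6, 0.05], [40.0, 12.0, 80.0]))
```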
arXiv Detail & Related papers (2021-03-23T17:57:40Z)
- Contextual Dropout: An Efficient Sample-Dependent Dropout Module [60.63525456640462]
Dropout has been demonstrated as a simple and effective module to regularize the training process of deep neural networks.
We propose contextual dropout with an efficient structural design as a simple and scalable sample-dependent dropout module.
Our experimental results show that the proposed method outperforms baseline methods in terms of both accuracy and quality of uncertainty estimation.
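A rough sketch of the sample-dependent idea: dropout probabilities are produced from the activations themselves by a small gating function rather than being a fixed hyperparameter. The paper's actual parameterization and training objective are not reproduced; all names are illustrative.

```python
import numpy as np

def contextual_dropout(h, W_gate, b_gate, rng, training=True):
    """Sample-dependent dropout: keep probabilities depend on the input.

    h      : activations of shape (batch, d)
    W_gate : (d, d) weights of a small gating function (trainable in practice)
    b_gate : (d,) bias of the gating function
    """
    keep_prob = 1.0 / (1.0 + np.exp(-(h @ W_gate + b_gate)))  # per-unit sigmoid gate
    if not training:
        return h                        # inverted dropout: identity at test time
    mask = rng.random(h.shape) < keep_prob
    return h * mask / np.clip(keep_prob, 1e-6, None)  # inverted-dropout scaling
```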
arXiv Detail & Related papers (2021-03-06T19:30:32Z)
- Generalized Multi-Output Gaussian Process Censored Regression [7.111443975103331]
We introduce a heteroscedastic multi-output Gaussian process model which combines the non-parametric flexibility of GPs with the ability to leverage information from correlated outputs under input-dependent noise conditions.
Results show how the added flexibility allows our model to better estimate the underlying non-censored (i.e. true) process under potentially complex censoring dynamics.
arXiv Detail & Related papers (2020-09-10T12:46:29Z)
- Uncertainty Quantification for Demand Prediction in Contextual Dynamic Pricing [20.828160401904697]
We study the problem of constructing accurate confidence intervals for the demand function.
We develop a debiased approach and provide the normality guarantee of the debiased estimator.
arXiv Detail & Related papers (2020-03-16T04:21:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.