Preprocessing noisy functional data using factor models
- URL: http://arxiv.org/abs/2012.05824v1
- Date: Thu, 10 Dec 2020 16:54:44 GMT
- Title: Preprocessing noisy functional data using factor models
- Authors: Siegfried Hörmann and Fatima Jammoul
- Abstract summary: We consider functional data which are measured on a discrete set of observation points.
Signal and noise can be naturally represented as the common and idiosyncratic component of a factor model.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider functional data which are measured on a discrete set of
observation points. Often such data are measured with noise, and then the
target is to recover the underlying signal. Most commonly, practitioners use
some smoothing approach, e.g., kernel smoothing or spline fitting, toward this
goal. The drawback of such curve-fitting techniques is that they act function
by function and do not take into account information from the entire sample. In
this paper we argue that signal and noise can be naturally represented as the
common and idiosyncratic component, respectively, of a factor model.
Accordingly, we propose an estimation scheme which is based on factor
models. The purpose of this paper is to explain the reasoning behind our
approach and to compare its performance on simulated and on real data to
competing methods.
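The decomposition described above can be illustrated with a minimal sketch: treat the n observed curves as rows of a data matrix, and estimate the common component (the signal) by a rank-k truncation of the centered data via SVD, which is the standard principal-components estimator of a factor model. This is an illustrative assumption-laden toy, not the authors' exact estimator; in particular, the number of factors k is assumed known, and the loadings (sine functions) and noise level are made up for the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate noisy functional data: n curves observed on p grid points.
n, p, k = 200, 100, 3  # k = number of factors (assumed known here)
t = np.linspace(0, 1, p)
# Hypothetical smooth loading functions (sine basis) for the simulation.
loadings = np.column_stack([np.sin((j + 1) * np.pi * t) for j in range(k)])
factors = rng.normal(size=(n, k))
signal = factors @ loadings.T                 # common component (the target)
X = signal + 0.5 * rng.normal(size=(n, p))    # add idiosyncratic noise

# Factor-model denoising: keep the leading k principal components of the
# centered data matrix; the rank-k reconstruction estimates the signal.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
common = X.mean(axis=0) + (U[:, :k] * s[:k]) @ Vt[:k]

mse_raw = np.mean((X - signal) ** 2)       # error of the raw observations
mse_fac = np.mean((common - signal) ** 2)  # error of the factor estimate
print(mse_raw, mse_fac)
```

Because the truncation pools information across all n curves, the reconstructed curves are substantially closer to the true signal than the raw observations, which is precisely the advantage over curve-by-curve smoothing that the abstract argues for.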
Related papers
- Robust Neural Processes for Noisy Data [1.7268667700090563]
We study the behavior of in-context learning models when data is contaminated by noise.
We find that the models that perform best on clean data, are different than the models that perform best on noisy data.
We propose a simple method to train NP models that makes them more robust to noisy data.
arXiv Detail & Related papers (2024-11-03T20:00:55Z) - Efficient Prior Calibration From Indirect Data [5.588334720483076]
This paper is concerned with learning the prior model from data, in particular, learning the prior from multiple realizations of indirect data obtained through the noisy observation process.
An efficient residual-based neural operator approximation of the forward model is proposed and it is shown that this may be learned concurrently with the pushforward map.
arXiv Detail & Related papers (2024-05-28T08:34:41Z) - Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z) - Root Cause Explanation of Outliers under Noisy Mechanisms [50.59446568076628]
Causal processes are often modelled as graphs, with entities as nodes and their paths/interconnections as edges.
Existing work only considers the contribution of nodes in the generative process.
We consider both individual edge and node of each mechanism when identifying the root causes.
arXiv Detail & Related papers (2023-12-19T03:24:26Z) - Improving the Robustness of Summarization Models by Detecting and
Removing Input Noise [50.27105057899601]
We present a large empirical study quantifying the sometimes severe loss in performance from different types of input noise for a range of datasets and model sizes.
We propose a light-weight method for detecting and removing such noise in the input during model inference without requiring any training, auxiliary models, or even prior knowledge of the type of noise.
arXiv Detail & Related papers (2022-12-20T00:33:11Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Modeling Temporal Data as Continuous Functions with Stochastic Process
Diffusion [2.2849153854336763]
Temporal data can be viewed as discretized measurements of the underlying function.
To build a generative model for such data we have to model the process that governs it.
We propose a solution by defining the denoising diffusion model in the function space.
arXiv Detail & Related papers (2022-11-04T17:02:01Z) - Deep Active Learning with Noise Stability [24.54974925491753]
Uncertainty estimation for unlabeled data is crucial to active learning.
We propose a novel algorithm that leverages noise stability to estimate data uncertainty.
Our method is generally applicable in various tasks, including computer vision, natural language processing, and structural data analysis.
arXiv Detail & Related papers (2022-05-26T13:21:01Z) - The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from this assumption can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z) - Harmonization with Flow-based Causal Inference [12.739380441313022]
This paper presents a normalizing-flow-based method to perform counterfactual inference upon a structural causal model (SCM) to harmonize medical data.
We evaluate on multiple, large, real-world medical datasets to observe that this method leads to better cross-domain generalization compared to state-of-the-art algorithms.
arXiv Detail & Related papers (2021-06-12T19:57:35Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.