Streaming Gaussian Dirichlet Random Fields for Spatial Predictions of
High Dimensional Categorical Observations
- URL: http://arxiv.org/abs/2402.15359v1
- Date: Fri, 23 Feb 2024 14:52:05 GMT
- Title: Streaming Gaussian Dirichlet Random Fields for Spatial Predictions of
High Dimensional Categorical Observations
- Authors: J. E. San Soucie, H. M. Sosik, Y. Girdhar
- Abstract summary: We present a novel approach for modeling a stream of spatiotemporally distributed, sparse, high-dimensional categorical observations.
The proposed approach efficiently learns global and local patterns in streaming spatiotemporal data.
We demonstrate the ability of the approach to make more accurate predictions than a Variational Gaussian Process.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present the Streaming Gaussian Dirichlet Random Field (S-GDRF) model, a
novel approach for modeling a stream of spatiotemporally distributed, sparse,
high-dimensional categorical observations. The proposed approach efficiently
learns global and local patterns in spatiotemporal data, allowing for fast
inference and querying with a bounded time complexity. Using a high-resolution
data series of plankton images classified with a neural network, we demonstrate
the ability of the approach to make more accurate predictions compared to a
Variational Gaussian Process (VGP), and to learn a predictive distribution of
observations from streaming categorical data. S-GDRFs open the door to enabling
efficient informative path planning over high-dimensional categorical
observations, which until now has not been feasible.
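For intuition only, the following toy sketch mimics the flavor of the problem: absorbing a stream of (location, category) observations into a bounded buffer and querying a kernel-smoothed Dirichlet predictive distribution at new locations. It is an illustrative simplification, not the S-GDRF inference scheme of the paper; the class name, RBF kernel, buffer size, and hyperparameters are all assumptions.

```python
import numpy as np

# Toy sketch (NOT the S-GDRF model): kernel-weighted Dirichlet counts over a
# stream of (location, category) observations, queried for a predictive
# categorical distribution at new locations.

class StreamingDirichletField:
    def __init__(self, n_categories, lengthscale=1.0, alpha=0.1, max_points=500):
        self.K = n_categories
        self.ell = lengthscale
        self.alpha = alpha            # symmetric Dirichlet prior pseudo-count
        self.max_points = max_points  # bound memory/time with a sliding window
        self.locs = []                # observation locations
        self.cats = []                # observed category indices

    def update(self, loc, category):
        """Absorb one streaming observation into the bounded buffer."""
        self.locs.append(np.asarray(loc, dtype=float))
        self.cats.append(int(category))
        if len(self.locs) > self.max_points:
            self.locs.pop(0)
            self.cats.pop(0)

    def predict(self, query):
        """Kernel-smoothed Dirichlet posterior mean at a query location."""
        counts = np.full(self.K, self.alpha)
        if self.locs:
            d = np.linalg.norm(np.stack(self.locs) - np.asarray(query), axis=1)
            w = np.exp(-0.5 * (d / self.ell) ** 2)   # RBF weights
            for wi, ki in zip(w, self.cats):
                counts[ki] += wi
        return counts / counts.sum()

# Usage: a simulated 1D transect with two plankton-like categories.
rng = np.random.default_rng(0)
model = StreamingDirichletField(n_categories=2, lengthscale=0.2)
for x in rng.uniform(0, 1, 200):
    model.update([x], int(x > 0.5))          # category flips halfway along
print(model.predict([0.25]), model.predict([0.75]))
```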
Related papers
- Score Approximation, Estimation and Distribution Recovery of Diffusion
Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z) - Probabilistic forecasting for geosteering in fluvial successions using a
generative adversarial network [0.0]
Fast updates based on real-time data are essential when drilling in complex reservoirs with high uncertainties in pre-drill models.
We propose a generative adversarial deep neural network (GAN) trained to reproduce geologically consistent 2D sections of fluvial successions.
In our example, the method reduces uncertainty and correctly predicts most major geological features up to 500 meters ahead of the drill bit.
arXiv Detail & Related papers (2022-07-04T12:52:38Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z) - Probabilistic Time Series Forecasting with Implicit Quantile Networks [0.7249731529275341]
We combine an autoregressive recurrent neural network to model temporal dynamics with Implicit Quantile Networks to learn a large class of distributions over a time-series target.
Our approach is favorable in terms of point-wise prediction accuracy as well as in estimating the underlying temporal distribution.
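As a rough illustration of that idea (not the paper's exact architecture; the layer sizes, the cosine tau embedding, and all names here are assumptions), one can condition a recurrent encoder's state on a random quantile level and train with the pinball loss:

```python
import torch
import torch.nn as nn

# Hedged sketch: a GRU encodes history, an implicit quantile head maps
# (hidden state, random quantile level tau) to a predicted quantile, and
# training minimizes the pinball (quantile) loss.

class IQNForecaster(nn.Module):
    def __init__(self, hidden=32, tau_dim=16):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.tau_embed = nn.Linear(tau_dim, hidden)
        self.head = nn.Linear(hidden, 1)
        self.tau_dim = tau_dim

    def forward(self, history, tau):
        # history: (batch, time, 1); tau: (batch, 1) quantile levels in (0, 1)
        _, h = self.rnn(history)
        h = h.squeeze(0)
        # cosine embedding of tau, IQN-style quantile conditioning
        i = torch.arange(1, self.tau_dim + 1, device=tau.device).float()
        phi = torch.relu(self.tau_embed(torch.cos(torch.pi * i * tau)))
        return self.head(h * phi).squeeze(-1)   # predicted quantile value

def pinball_loss(pred, target, tau):
    err = target - pred
    t = tau.squeeze(-1)
    return torch.mean(torch.maximum(t * err, (t - 1) * err))

# One illustrative training step on random data.
model = IQNForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
history, target, tau = torch.randn(8, 24, 1), torch.randn(8), torch.rand(8, 1)
loss = pinball_loss(model(history, tau), target, tau)
loss.backward(); opt.step()
print(float(loss))
```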
arXiv Detail & Related papers (2021-07-08T10:37:24Z) - Sparse Algorithms for Markovian Gaussian Processes [18.999495374836584]
Sparse Markovian Gaussian processes combine the use of inducing variables with efficient Kalman filter-like recursions.
We derive a general site-based approach to approximate the non-Gaussian likelihood with local Gaussian terms, called sites.
Our approach results in a suite of novel sparse extensions to algorithms from both the machine learning and signal processing literatures, including variational inference, expectation propagation, and the classical nonlinear Kalman smoothers.
The derived methods are suited to spatio-temporal data, where the model has separate inducing points in both time and space.
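To illustrate the state-space view such methods build on, here is a toy Kalman filter for an Ornstein-Uhlenbeck (Matern-1/2) GP prior with a Gaussian likelihood; the paper's site-based approximation for non-Gaussian likelihoods and its sparse inducing-point machinery are not reproduced, and all hyperparameters are illustrative.

```python
import numpy as np

# Toy sketch: filtering an Ornstein-Uhlenbeck (Matern-1/2) GP with standard
# Kalman predict/update steps, assuming Gaussian observation noise.

def kalman_filter_ou(t, y, lengthscale=1.0, variance=1.0, noise=0.1):
    m, P = 0.0, variance          # prior mean and variance of the latent state
    means, variances = [], []
    prev_t = t[0]
    for ti, yi in zip(t, y):
        # Predict: OU transition between (possibly irregular) time points.
        a = np.exp(-(ti - prev_t) / lengthscale)
        m = a * m
        P = a * a * P + (1 - a * a) * variance
        # Update: standard Kalman step with observation noise.
        k = P / (P + noise)
        m = m + k * (yi - m)
        P = (1 - k) * P
        means.append(m); variances.append(P)
        prev_t = ti
    return np.array(means), np.array(variances)

t = np.linspace(0, 5, 50)
y = np.sin(t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
mu, var = kalman_filter_ou(t, y)
print(mu[-5:])
```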
arXiv Detail & Related papers (2021-03-19T09:50:53Z) - Remaining Useful Life Estimation Under Uncertainty with Causal GraphNets [0.0]
A novel approach for the construction and training of time series models is presented.
The proposed method is appropriate for constructing predictive models for non-stationary time series.
arXiv Detail & Related papers (2020-11-23T21:28:03Z) - Bayesian Graph Neural Networks with Adaptive Connection Sampling [62.51689735630133]
We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs).
The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs.
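A minimal sketch of the underlying mechanism is given below: randomly masking adjacency entries during message passing and averaging repeated stochastic forward passes for a rough uncertainty estimate. In the cited framework the sampling rates are learned adaptively; here the keep probability is fixed and all sizes and names are assumptions.

```python
import torch

# Toy one-layer GNN with random connection sampling: each edge is kept with
# probability p, and Monte Carlo repetition gives output mean and variance.

def sampled_gnn_layer(adj, x, weight, p=0.7):
    mask = torch.bernoulli(torch.full_like(adj, p))     # drop edges at random
    a = adj * mask
    deg = a.sum(dim=1, keepdim=True).clamp(min=1.0)     # simple row normalization
    return torch.relu((a / deg) @ x @ weight)

n, d, h = 6, 4, 3
adj = (torch.rand(n, n) > 0.5).float()
x = torch.randn(n, d)
w = torch.randn(d, h)

# Repeated stochastic passes over sampled connection patterns.
samples = torch.stack([sampled_gnn_layer(adj, x, w) for _ in range(20)])
print(samples.mean(0).shape, samples.var(0).mean())
```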
arXiv Detail & Related papers (2020-06-07T07:06:35Z) - Gaussian-Dirichlet Random Fields for Inference over High Dimensional
Categorical Observations [3.383942690870476]
We propose a generative model for the distribution of high dimensional categorical observations produced by robots.
The proposed approach combines the use of Dirichlet distributions to model sparse co-occurrence relations between the observed categories.
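For intuition, a hedged sketch of a GDRF-style generative story follows (the kernel, sizes, and hyperparameters are assumptions, not the paper's): spatially correlated logits induce topic probabilities at each location, each topic draws a sparse Dirichlet distribution over categories, and observations are categorical draws.

```python
import numpy as np

# Illustrative generative process in the GDRF spirit: GP-correlated topic
# logits over space, Dirichlet-drawn per-topic category distributions,
# and categorical observations.

rng = np.random.default_rng(0)
n_locs, n_topics, n_categories = 50, 3, 10
x = np.linspace(0, 1, n_locs)[:, None]

# Spatially correlated topic logits via a squared-exponential kernel.
K = np.exp(-0.5 * ((x - x.T) / 0.1) ** 2) + 1e-6 * np.eye(n_locs)
L = np.linalg.cholesky(K)
logits = L @ rng.normal(size=(n_locs, n_topics))
topic_probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Sparse Dirichlet prior over per-topic category distributions.
beta = rng.dirichlet(np.full(n_categories, 0.1), size=n_topics)

# One categorical observation per location.
topics = np.array([rng.choice(n_topics, p=p) for p in topic_probs])
obs = np.array([rng.choice(n_categories, p=beta[z]) for z in topics])
print(obs[:10])
```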
arXiv Detail & Related papers (2020-03-26T19:29:23Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z) - Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.