Improving Efficiency of Iso-Surface Extraction on Implicit Neural
Representations Using Uncertainty Propagation
- URL: http://arxiv.org/abs/2402.13861v1
- Date: Wed, 21 Feb 2024 15:10:20 GMT
- Title: Improving Efficiency of Iso-Surface Extraction on Implicit Neural
Representations Using Uncertainty Propagation
- Authors: Haoyu Li and Han-Wei Shen
- Abstract summary: Implicit neural representations (INRs) are widely used for scientific data reduction and visualization.
Range analysis has shown promising results in improving the efficiency of geometric queries on INRs for 3D geometries.
We present an improved technique for range analysis by revisiting the arithmetic rules and analyzing the probability distribution of the network output within a spatial region.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Implicit neural representations (INRs) are widely used for scientific data
reduction and visualization by modeling the function that maps a spatial
location to a data value. Without any prior knowledge about the spatial
distribution of values, we are forced to sample densely from INRs to perform
visualization tasks like iso-surface extraction which can be very
computationally expensive. Recently, range analysis has shown promising results
in improving the efficiency of geometric queries, such as ray casting and
hierarchical mesh extraction, on INRs for 3D geometries by using arithmetic
rules to bound the output range of the network within a spatial region.
However, the analysis bounds are often too conservative for complex scientific
data. In this paper, we present an improved technique for range analysis by
revisiting the arithmetic rules and analyzing the probability distribution of
the network output within a spatial region. We model this distribution
efficiently as a Gaussian distribution by applying the central limit theorem.
Excluding low probability values, we are able to tighten the output bounds,
resulting in a more accurate estimation of the value range, and hence more
accurate identification of iso-surface cells and more efficient iso-surface
extraction on INRs. Our approach demonstrates superior performance in terms of
the iso-surface extraction time on four datasets compared to the original range
analysis method and can also be generalized to other geometric query tasks.
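The two bounding strategies contrasted in the abstract can be sketched numerically. The following is a minimal, hypothetical comparison (not the paper's code): classic interval arithmetic propagates a spatial box through a toy two-layer ReLU network, while the probabilistic bound keeps mu +/- 3*sigma of the output over the region. Here mu and sigma are Monte Carlo estimates; the paper instead derives them analytically by propagating moments through the network and invoking the central limit theorem.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small random 2-layer MLP standing in for an INR: f(x) = W2 @ relu(W1 @ x + b1) + b2
d_in, d_hid = 3, 16
W1 = rng.normal(size=(d_hid, d_in)); b1 = rng.normal(size=d_hid)
W2 = rng.normal(size=(1, d_hid));    b2 = rng.normal(size=1)

def forward(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def linear_interval(lo, hi, W, b):
    """Propagate an axis-aligned box through an affine layer (exact)."""
    mid, rad = (lo + hi) / 2, (hi - lo) / 2
    c = W @ mid + b
    r = np.abs(W) @ rad
    return c - r, c + r

# Classic range analysis: interval arithmetic, layer by layer.
lo, hi = np.zeros(d_in), np.full(d_in, 0.1)           # the spatial region
l1, h1 = linear_interval(lo, hi, W1, b1)
l1, h1 = np.maximum(l1, 0.0), np.maximum(h1, 0.0)     # ReLU is monotone
l2, h2 = linear_interval(l1, h1, W2, b2)

# Probabilistic bound in the spirit of the paper: model the output over the
# region as a Gaussian and keep mu +/- 3*sigma (drops < 0.3% of the mass).
xs = rng.uniform(lo, hi, size=(10000, d_in))
ys = np.array([forward(x)[0] for x in xs])
mu, sigma = ys.mean(), ys.std()

print(f"interval bound:      [{l2[0]:.3f}, {h2[0]:.3f}]")
print(f"probabilistic bound: [{mu - 3*sigma:.3f}, {mu + 3*sigma:.3f}]")
```

The interval bound is guaranteed to contain every output over the region; the Gaussian bound trades that guarantee for tightness, which is what makes the iso-surface cell test more selective.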
Related papers
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
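For orientation, here is a sketch of the deterministic consensus-ADMM updates that such a sampler builds on, applied to distributed least squares; the paper's contribution is a stochastic variant that draws posterior samples, which this sketch does not implement.

```python
import numpy as np

rng = np.random.default_rng(1)

# Distributed least squares via consensus ADMM: minimize sum_k ||A_k x - b_k||^2
# with each worker k holding only (A_k, b_k).
K, n, d = 3, 30, 3                      # workers, rows per worker, features
x_true = rng.normal(size=d)
A = [rng.normal(size=(n, d)) for _ in range(K)]
b = [A_k @ x_true + 0.01 * rng.normal(size=n) for A_k in A]

rho = 1.0
x = [np.zeros(d) for _ in range(K)]     # local primal variables
u = [np.zeros(d) for _ in range(K)]     # scaled dual variables
z = np.zeros(d)                         # consensus variable

for _ in range(300):
    for k in range(K):                  # local solves, data stays on worker k
        x[k] = np.linalg.solve(A[k].T @ A[k] + rho * np.eye(d),
                               A[k].T @ b[k] + rho * (z - u[k]))
    z = np.mean([x[k] + u[k] for k in range(K)], axis=0)   # averaging step
    for k in range(K):
        u[k] += x[k] - z                                   # dual update

print("consensus solution:", z)
```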
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Efficient Large-scale Nonstationary Spatial Covariance Function
Estimation Using Convolutional Neural Networks [3.5455896230714194]
We use ConvNets to derive subregions from the nonstationary data.
We employ a selection mechanism to identify subregions that exhibit similar behavior to stationary fields.
We assess the performance of the proposed method with synthetic and real datasets at a large scale.
arXiv Detail & Related papers (2023-06-20T12:17:46Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Super-resolution GANs of randomly-seeded fields [68.8204255655161]
We propose a novel super-resolution generative adversarial network (GAN) framework to estimate field quantities from random sparse sensors.
The algorithm exploits random sampling to provide incomplete views of the high-resolution underlying distributions.
The proposed technique is tested on synthetic databases of fluid flow simulations, ocean surface temperature measurements, and particle image velocimetry data.
arXiv Detail & Related papers (2022-02-23T18:57:53Z)
- Spelunking the Deep: Guaranteed Queries for General Neural Implicit
Surfaces [35.438964954948574]
This work presents a new approach to perform queries directly on general neural implicit functions for a wide range of existing architectures.
Our key tool is the application of range analysis to neural networks, using automatic arithmetic rules to bound the output of a network over a region.
We use the resulting bounds to develop queries including ray casting, intersection testing, constructing spatial hierarchies, fast mesh extraction, and closest-point evaluation.
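The core pruning loop that such guaranteed queries rely on can be sketched as follows. This is an illustrative example, not the paper's code: the implicit function is an analytic sphere SDF, f(p) = |p| - 1, whose interval extension is exact; for a neural implicit, `sdf_interval` would instead come from layer-wise range analysis of the network.

```python
import numpy as np

def sdf_interval(lo, hi):
    """Bound f(p) = |p| - 1 over the box [lo, hi] (exact for this SDF)."""
    abs_min = np.where((lo <= 0) & (hi >= 0), 0.0,
                       np.minimum(np.abs(lo), np.abs(hi)))
    abs_max = np.maximum(np.abs(lo), np.abs(hi))
    return np.sqrt(np.sum(abs_min**2)) - 1.0, np.sqrt(np.sum(abs_max**2)) - 1.0

def surface_cells(lo, hi, depth):
    """Recursively subdivide, keeping only boxes that may touch the surface."""
    f_lo, f_hi = sdf_interval(lo, hi)
    if f_lo > 0 or f_hi < 0:          # bound excludes the zero level set: prune
        return []
    if depth == 0:
        return [(lo, hi)]
    mid = (lo + hi) / 2
    cells = []
    for corner in range(8):           # split into 8 octants
        sel = np.array([(corner >> i) & 1 for i in range(3)], dtype=bool)
        cells += surface_cells(np.where(sel, mid, lo),
                               np.where(sel, hi, mid), depth - 1)
    return cells

cells = surface_cells(np.full(3, -2.0), np.full(3, 2.0), depth=4)
print(f"{len(cells)} of {8**4} leaf cells may contain the iso-surface")
```

Because the interval bound is a guaranteed enclosure, no cell containing part of the surface is ever pruned; tightening the bound (as the paper above does probabilistically) prunes more of the remaining false positives.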
arXiv Detail & Related papers (2022-02-05T00:37:08Z)
- Probabilistic partition of unity networks: clustering based deep
approximation [0.0]
Partition of unity networks (POU-Nets) have been shown capable of realizing algebraic convergence rates for regression and solution of PDEs.
We enrich POU-Nets with a Gaussian noise model to obtain a probabilistic generalization amenable to gradient-based minimization of a maximum likelihood loss.
We provide benchmarks quantifying performance in high/low-dimensions, demonstrating that convergence rates depend only on the latent dimension of data within high-dimensional space.
arXiv Detail & Related papers (2021-07-07T08:02:00Z)
- Local approximate Gaussian process regression for data-driven
constitutive laws: Development and comparison with neural networks [0.0]
We show how to use local approximate Gaussian process regression (laGPR) to predict stress outputs at particular strain space locations.
A modified Newton-Raphson approach is proposed to accommodate for the local nature of the laGPR approximation when solving the global structural problem in a FE setting.
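The building block behind laGPR is ordinary Gaussian process regression; a minimal dense sketch is below (laGPR additionally restricts each prediction to the nearest training neighbours, which this sketch omits). The kernel, length scale, and the toy stress-strain data are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def rbf(X1, X2, length=0.5):
    """Squared-exponential kernel between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_predict(X_train, y_train, X_test, noise=1e-6):
    """Posterior mean and pointwise standard deviation of a zero-mean GP."""
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf(X_test, X_train)
    mean = K_s @ np.linalg.solve(K, y_train)
    cov = rbf(X_test, X_test) - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Toy 1D "constitutive law": stress as a smooth function of strain.
X = np.linspace(0, 1, 20)[:, None]
y = np.sin(3 * X[:, 0])
Xq = np.array([[0.25], [0.75]])
mean, std = gp_predict(X, y, Xq)
print("predictions:", mean, "uncertainty:", std)
```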
arXiv Detail & Related papers (2021-05-07T14:49:28Z)
- GENs: Generative Encoding Networks [4.269725092203672]
We propose and analyze the use of nonparametric density methods to estimate the Jensen-Shannon divergence for matching unknown data distributions to known target distributions.
This analytical method has several advantages: better behavior when training sample quantity is low, provable convergence properties, and relatively few parameters, which can be derived analytically.
arXiv Detail & Related papers (2020-10-28T23:40:03Z)
- Augmented Sliced Wasserstein Distances [55.028065567756066]
We propose a new family of distance metrics, called augmented sliced Wasserstein distances (ASWDs).
ASWDs are constructed by first mapping samples to higher-dimensional hypersurfaces parameterized by neural networks.
Numerical results demonstrate that the ASWD significantly outperforms other Wasserstein variants for both synthetic and real-world problems.
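For context, here is the vanilla sliced Wasserstein distance that the ASWD generalizes: samples are projected onto random directions and the 1D Wasserstein-2 distances, which reduce to sorted-sample comparisons, are averaged. The ASWD replaces these linear projections with neural-network-parameterized hypersurfaces, which this sketch does not implement. It assumes equal sample sizes.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, seed=0):
    """Monte Carlo sliced Wasserstein-2 distance between equal-size samples."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # unit directions
    px = np.sort(X @ theta.T, axis=0)   # sorted 1D projections = quantiles
    py = np.sort(Y @ theta.T, axis=0)
    return np.sqrt(np.mean((px - py) ** 2))

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
Y = rng.normal(size=(500, 2)) + np.array([3.0, 0.0])       # shifted copy
print("SW(X, similar):", sliced_wasserstein(X, rng.normal(size=(500, 2))))
print("SW(X, shifted):", sliced_wasserstein(X, Y))
```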
arXiv Detail & Related papers (2020-06-15T23:00:08Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximate framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- Spatial-Spectral Residual Network for Hyperspectral Image
Super-Resolution [82.1739023587565]
We propose a novel spectral-spatial residual network for hyperspectral image super-resolution (SSRNet).
Our method can effectively explore spatial-spectral information by using 3D convolution instead of 2D convolution, which enables the network to better extract potential information.
In each unit, we employ spatial and spectral separable 3D convolution to extract spatial and spectral information, which not only reduces unaffordable memory usage and high computational cost, but also makes the network easier to train.
arXiv Detail & Related papers (2020-01-14T03:34:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.