Gappy local conformal auto-encoders for heterogeneous data fusion: in
praise of rigidity
- URL: http://arxiv.org/abs/2312.13155v1
- Date: Wed, 20 Dec 2023 16:18:51 GMT
- Title: Gappy local conformal auto-encoders for heterogeneous data fusion: in
praise of rigidity
- Authors: Erez Peterfreund, Iryna Burak, Ofir Lindenbaum, Jim Gimlett, Felix
Dietrich, Ronald R. Coifman, Ioannis G. Kevrekidis
- Abstract summary: We propose an end-to-end computational pipeline in the form of a multiple-auto-encoder neural network architecture for fusing heterogeneous partial measurements.
The inputs to the pipeline are several sets of partial observations, and the result is a globally consistent latent space, harmonizing (rigidifying, fusing) all measurements.
We demonstrate the approach in a sequence of examples, starting with simple two-dimensional data sets and proceeding to a Wi-Fi localization problem.
- Score: 6.1152340690876095
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Fusing measurements from multiple, heterogeneous, partial sources, observing
a common object or process, is challenging given the growing number and
variety of available sensors. In this work we propose, implement and
validate an end-to-end computational pipeline in the form of a
multiple-auto-encoder neural network architecture for this task. The inputs to
the pipeline are several sets of partial observations, and the result is a
globally consistent latent space, harmonizing (rigidifying, fusing) all
measurements. The key enabler is the availability of multiple slightly
perturbed measurements of each instance: local measurement "bursts" that
allow us to estimate the local distortion induced by each instrument. We
demonstrate the approach in a sequence of examples, starting with simple
two-dimensional data sets and proceeding to a Wi-Fi localization problem and to
the solution of a "dynamical puzzle" arising in spatio-temporal observations of
the solutions of Partial Differential Equations.
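The role of measurement "bursts" can be illustrated with a short sketch. The following is a minimal, self-contained illustration, not the authors' implementation: `sensor` is a hypothetical distorting instrument, and `conformality_loss` is a stand-in for the kind of burst-covariance penalty that would drive each auto-encoder toward a locally conformal embedding.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_bursts(points, n_burst=64, sigma=0.05):
    """For each base point, draw a small isotropic cloud of slightly
    perturbed measurements: a 'burst'."""
    noise = rng.standard_normal((len(points), n_burst, points.shape[1]))
    return points[:, None, :] + sigma * noise

def sensor(x):
    """A hypothetical nonlinear instrument distorting the latent plane."""
    return np.stack([x[..., 0] + 0.5 * x[..., 1] ** 2,
                     np.sin(x[..., 1])], axis=-1)

def conformality_loss(embedded_bursts):
    """Mean squared deviation of each burst's scale-normalized covariance
    from the identity; near zero when the map is locally conformal."""
    d = embedded_bursts.shape[-1]
    total = 0.0
    for burst in embedded_bursts:
        cov = np.cov(burst.T)
        scale = np.trace(cov) / d
        total += np.linalg.norm(cov / scale - np.eye(d)) ** 2
    return total / len(embedded_bursts)

latent = rng.uniform(-1.0, 1.0, size=(50, 2))
bursts = make_bursts(latent)

loss_latent = conformality_loss(bursts)          # isotropic bursts: small loss
loss_sensor = conformality_loss(sensor(bursts))  # distorted bursts: larger loss
print(loss_latent, loss_sensor)
```

Because the true bursts are isotropic, any anisotropy in their observed covariances exposes the instrument's local Jacobian; penalizing that anisotropy is what "rigidifies" the learned latent space.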
Related papers
- Mesh Denoising Transformer [104.5404564075393]
Mesh denoising is aimed at removing noise from input meshes while preserving their feature structures.
SurfaceFormer is a pioneering Transformer-based mesh denoising framework.
New representation known as Local Surface Descriptor captures local geometric intricacies.
Denoising Transformer module receives the multimodal information and achieves efficient global feature aggregation.
arXiv Detail & Related papers (2024-05-10T15:27:43Z)
- D2NO: Efficient Handling of Heterogeneous Input Function Spaces with Distributed Deep Neural Operators [7.119066725173193]
We propose a novel distributed approach to deal with input functions that exhibit heterogeneous properties.
A central neural network is used to handle shared information across all output functions.
We demonstrate that the corresponding neural network is a universal approximator of continuous nonlinear operators.
arXiv Detail & Related papers (2023-10-29T03:29:59Z)
- Compatible Transformer for Irregularly Sampled Multivariate Time Series [75.79309862085303]
We propose a transformer-based encoder to achieve comprehensive temporal-interaction feature learning for each individual sample.
We conduct extensive experiments on 3 real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods.
arXiv Detail & Related papers (2023-10-17T06:29:09Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Convergent autoencoder approximation of low bending and low distortion manifold embeddings [5.5711773076846365]
We propose and analyze a novel regularization for learning the encoder component of an autoencoder.
The loss functional is computed via Monte Carlo integration with different sampling strategies for pairs of points on the input manifold.
Our main theorem identifies a loss functional of the embedding map as the $\Gamma$-limit of the sampling-dependent loss functionals.
arXiv Detail & Related papers (2022-08-22T10:31:31Z)
- DPCN++: Differentiable Phase Correlation Network for Versatile Pose Registration [18.60311260250232]
We present a differentiable phase correlation solver that is globally convergent and correspondence-free.
We evaluate DPCN++ on a wide range of registration tasks with different input modalities, including 2D bird's-eye view images, 3D object and scene measurements, and medical images.
arXiv Detail & Related papers (2022-06-12T10:00:34Z)
- Push--Pull with Device Sampling [8.344476599818826]
We consider decentralized optimization problems in which a number of agents collaborate to minimize the average of their local functions by exchanging over an underlying communication graph.
We propose an algorithm that combines gradient tracking and variance reduction over the entire network.
Our theoretical analysis shows that the algorithm converges linearly, when the local objective functions are strongly convex.
arXiv Detail & Related papers (2022-06-08T18:18:18Z)
- Deep Federated Anomaly Detection for Multivariate Time Series Data [93.08977495974978]
We present a Federated Exemplar-based Deep Neural Network (Fed-ExDNN) to conduct anomaly detection for multivariate time series data on different edge devices.
We show that ExDNN and Fed-ExDNN can outperform state-of-the-art anomaly detection algorithms and federated learning techniques.
arXiv Detail & Related papers (2022-05-09T05:06:58Z)
- Federated Learning Based on Dynamic Regularization [43.137064459520886]
We propose a novel federated learning method for distributively training neural network models.
Server orchestrates cooperation between a subset of randomly chosen devices in each round.
arXiv Detail & Related papers (2021-11-08T03:58:28Z)
- Solving Sparse Linear Inverse Problems in Communication Systems: A Deep Learning Approach With Adaptive Depth [51.40441097625201]
We propose an end-to-end trainable deep learning architecture for sparse signal recovery problems.
The proposed method learns how many layers to execute to emit an output, and the network depth is dynamically adjusted for each task in the inference phase.
arXiv Detail & Related papers (2020-10-29T06:32:53Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.