Signal Processing for Implicit Neural Representations
- URL: http://arxiv.org/abs/2210.08772v1
- Date: Mon, 17 Oct 2022 06:29:07 GMT
- Title: Signal Processing for Implicit Neural Representations
- Authors: Dejia Xu, Peihao Wang, Yifan Jiang, Zhiwen Fan, Zhangyang Wang
- Abstract summary: Implicit Neural Representations (INRs) encode continuous multimedia data via multi-layer perceptrons.
Existing works manipulate such continuous representations by processing their discretized instances.
We propose an implicit neural signal processing network, dubbed INSP-Net, built from differential operators on INRs.
- Score: 80.38097216996164
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Implicit Neural Representations (INRs), which encode continuous
multimedia data via multi-layer perceptrons, have shown undeniable promise in
various computer vision tasks. Despite many successful applications, editing
and processing an INR remain intractable, as signals are represented by the
latent parameters of a neural network. Existing works manipulate such
continuous representations by processing their discretized instances, which
breaks the compactness and continuity of INRs. In this work, we present a
pilot study on the question: how can an INR be modified directly, without
explicit decoding? We answer this question by proposing an implicit neural
signal processing network, dubbed INSP-Net, built on differential operators
applied to INRs. Our key insight is that the spatial gradients of a neural
network can be computed analytically and are invariant to translation, and we
show mathematically that any continuous convolution filter can be uniformly
approximated by a linear combination of high-order differential operators.
With these two knobs, INSP-Net instantiates a signal processing operator as a
weighted composition of the computational graphs corresponding to the
high-order derivatives of an INR, where the weighting parameters are learned
from data. Building on INSP-Net, we further construct the first Convolutional
Neural Network (CNN) that runs implicitly on INRs, named INSP-ConvNet. Our
experiments validate the expressiveness of INSP-Net and INSP-ConvNet in
fitting low-level image and geometry processing kernels (e.g., blurring,
deblurring, denoising, inpainting, and smoothing) as well as in high-level
tasks on implicit fields, such as image classification.
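The core mechanism lends itself to a compact sketch: because an INR is a differentiable network, its spatial derivatives can be computed analytically via automatic differentiation and then combined with learnable weights to approximate a convolution filter. The PyTorch snippet below illustrates this idea; the INR architecture, the truncation at second order, and the names INR and insp_layer are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class INR(nn.Module):
    """Toy MLP-based INR mapping 2-D coordinates to a scalar signal."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def insp_layer(f, x, weights):
    """Evaluate a learned linear combination of f and its analytic spatial
    derivatives at coordinates x of shape (N, 2). Illustrative: the series
    is truncated at second order here."""
    x = x.requires_grad_(True)
    y = f(x)
    (grad,) = torch.autograd.grad(y.sum(), x, create_graph=True)   # f_x, f_y
    second = []
    for i in range(2):                                             # f_xx, f_yy
        (g2,) = torch.autograd.grad(grad[:, i].sum(), x, create_graph=True)
        second.append(g2[:, i:i + 1])
    feats = torch.cat([y, grad] + second, dim=1)                   # (N, 5)
    return feats @ weights                                         # weighted composition

inr = INR()
w = nn.Parameter(torch.randn(5, 1) * 0.1)   # weights to be learned from data
coords = torch.rand(1024, 2) * 2 - 1
out = insp_layer(inr, coords, w)            # processed signal, still continuous in x
```

Because the weighted combination is itself a computational graph over the INR, the output remains a continuous function of the coordinates, which is what allows such layers to be stacked in the manner of a CNN.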
Related papers
- Locality-Aware Generalizable Implicit Neural Representation [54.93702310461174]
Generalizable implicit neural representation (INR) enables a single continuous function to represent multiple data instances.
We propose a novel framework for generalizable INR that combines a transformer encoder with a locality-aware INR decoder.
Our framework significantly outperforms previous generalizable INRs and validates the usefulness of the locality-aware latents for downstream tasks.
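A minimal sketch of how a locality-aware decoder might consume encoder outputs: a transformer encoder produces a grid of local latent tokens, and each query coordinate is decoded against its spatially nearest token. The class name, the nearest-neighbor lookup, and all sizes below are assumptions for illustration; the paper's actual decoder may differ.

```python
import torch
import torch.nn as nn

class LocalINRDecoder(nn.Module):
    """Decode a coordinate conditioned on the spatially nearest latent token
    from an H x W grid of local latents (an illustrative assumption)."""
    def __init__(self, latent_dim=128, hidden=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, coords, latents):
        # coords: (N, 2) in [0, 1]; latents: (H, W, latent_dim)
        H, W, _ = latents.shape
        ij = (coords * coords.new_tensor([H - 1, W - 1])).round().long()
        local = latents[ij[:, 0], ij[:, 1]]   # nearest local latent per point
        return self.mlp(torch.cat([coords, local], dim=1))

# a transformer encoder (e.g., over image patches) would produce `latents`;
# random tokens stand in here for shape-checking only
latents = torch.randn(16, 16, 128)
decoder = LocalINRDecoder()
rgb = decoder(torch.rand(4096, 2), latents)   # (4096, 3)
```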
arXiv Detail & Related papers (2023-10-09T11:26:58Z)
- Modality-Agnostic Variational Compression of Implicit Neural Representations [96.35492043867104]
We introduce a modality-agnostic neural compression algorithm based on a functional view of data and parameterised as an Implicit Neural Representation (INR).
Bridging the gap between latent coding and sparsity, we obtain compact latent representations non-linearly mapped to a soft gating mechanism.
After obtaining a dataset of such latent representations, we directly optimise the rate/distortion trade-off in a modality-agnostic space using neural compression.
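One plausible reading of the latent-to-gating step, sketched below: a compact per-instance latent is non-linearly mapped to soft gates in (0, 1) that modulate the hidden units of a shared INR, so only the small latent needs to be compressed per instance. The GatedINR class and all sizes are illustrative assumptions rather than the paper's architecture.

```python
import torch
import torch.nn as nn

class GatedINR(nn.Module):
    """INR whose hidden units are modulated by soft gates derived from a
    compact per-instance latent (an illustrative reading of the summary)."""
    def __init__(self, latent_dim=32, hidden=128):
        super().__init__()
        self.layer1 = nn.Linear(2, hidden)
        self.layer2 = nn.Linear(hidden, hidden)
        self.head = nn.Linear(hidden, 3)
        # non-linear map from latent to per-unit gates in (0, 1)
        self.to_gates = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * hidden),
        )

    def forward(self, coords, z):
        g1, g2 = torch.sigmoid(self.to_gates(z)).chunk(2, dim=-1)
        h = torch.sin(self.layer1(coords)) * g1   # gates sparsify hidden units
        h = torch.sin(self.layer2(h)) * g2
        return self.head(h)

model = GatedINR()
z = torch.randn(32)                 # compact latent, the object to be entropy-coded
rgb = model(torch.rand(1024, 2), z)
```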
arXiv Detail & Related papers (2023-01-23T15:22:42Z)
- Versatile Neural Processes for Learning Implicit Neural Representations [57.090658265140384]
We propose Versatile Neural Processes (VNP), which largely increases the capability of approximating functions.
Specifically, we introduce a bottleneck encoder that produces fewer and informative context tokens, relieving the high computational cost.
We demonstrate the effectiveness of the proposed VNP on a variety of tasks involving 1D, 2D and 3D signals.
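A sketch of what a bottleneck encoder of this kind could look like: a small set of learned queries cross-attends to a large set of context tokens, yielding far fewer tokens for the downstream model to process. This Perceiver-style reduction is an assumption for illustration; the VNP paper's exact encoder may differ.

```python
import torch
import torch.nn as nn

class BottleneckEncoder(nn.Module):
    """Compress many context tokens into a few informative ones via
    cross-attention from learned queries (an illustrative sketch)."""
    def __init__(self, dim=128, num_bottleneck=16, heads=4):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(num_bottleneck, dim) * 0.02)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, context):                   # context: (B, N, dim), N large
        q = self.queries.expand(context.size(0), -1, -1)
        out, _ = self.attn(q, context, context)   # (B, num_bottleneck, dim)
        return out

enc = BottleneckEncoder()
tokens = enc(torch.randn(2, 4096, 128))           # 4096 tokens -> 16 tokens
```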
arXiv Detail & Related papers (2023-01-21T04:08:46Z)
- Transformers as Meta-Learners for Implicit Neural Representations [10.673855995948736]
Implicit Neural Representations (INRs) have emerged and shown their benefits over discrete representations in recent years.
We propose a formulation that uses Transformers as hypernetworks for INRs, where it can directly build the whole set of INR weights.
We demonstrate the effectiveness of our method for building INRs in different tasks and domains, including 2D image regression and view synthesis for 3D objects.
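The hypernetwork formulation is easy to sketch: a transformer encodes observations of a signal and emits a flat weight vector, which is reshaped into the layers of a small INR and evaluated functionally. All sizes, the mean-pooling step, and the helper run_inr below are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class TransformerHypernet(nn.Module):
    """Map a set of observation tokens to the full weight vector of a small
    INR MLP (token design and sizes are illustrative assumptions)."""
    def __init__(self, dim=128, inr_sizes=((2, 64), (64, 64), (64, 3))):
        super().__init__()
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.shapes = inr_sizes
        n_params = sum(i * o + o for i, o in inr_sizes)
        self.to_weights = nn.Linear(dim, n_params)

    def forward(self, tokens):                    # tokens: (B, N, dim)
        pooled = self.encoder(tokens).mean(dim=1) # (B, dim)
        return self.to_weights(pooled)            # flat INR weights per instance

def run_inr(flat, x, shapes=((2, 64), (64, 64), (64, 3))):
    """Evaluate the generated INR functionally on coordinates x (N, 2)."""
    idx = 0
    for k, (i, o) in enumerate(shapes):
        W = flat[idx:idx + i * o].view(o, i); idx += i * o
        b = flat[idx:idx + o]; idx += o
        x = torch.relu(x @ W.T + b) if k < len(shapes) - 1 else x @ W.T + b
    return x

hyper = TransformerHypernet()
weights = hyper(torch.randn(1, 256, 128))[0]      # one instance's INR weights
rgb = run_inr(weights, torch.rand(1024, 2))
```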
arXiv Detail & Related papers (2022-08-04T17:54:38Z)
- Sobolev Training for Implicit Neural Representations with Approximated Image Derivatives [12.71676484494428]
Implicit Neural Representations (INRs) parameterized by neural networks have emerged as a powerful tool to represent different kinds of signals.
We propose a training paradigm for INRs whose target output is image pixels, to encode image derivatives in addition to image values in the neural network.
We show how the training paradigm can be leveraged to solve typical INR problems, such as image regression and inverse rendering.
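The training paradigm reduces to adding a derivative term to the usual reconstruction loss. A minimal sketch, assuming finite-difference (e.g., Sobel) gradients of the target image as supervision and an illustrative weighting lam:

```python
import torch
import torch.nn.functional as F

def sobolev_loss(inr, coords, target_vals, target_grads, lam=0.1):
    """Penalize errors in both pixel values and first derivatives.
    target_grads would come from finite-difference (e.g., Sobel) estimates
    on the training image; lam is an illustrative weighting, not the paper's."""
    coords = coords.requires_grad_(True)
    pred = inr(coords)
    (pred_grad,) = torch.autograd.grad(pred.sum(), coords, create_graph=True)
    return F.mse_loss(pred, target_vals) + lam * F.mse_loss(pred_grad, target_grads)

inr = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
loss = sobolev_loss(inr, torch.rand(256, 2), torch.rand(256, 1), torch.rand(256, 2))
loss.backward()   # gradients flow to the INR weights through both terms
```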
arXiv Detail & Related papers (2022-07-21T10:12:41Z)
- Neural Implicit Dictionary via Mixture-of-Expert Training [111.08941206369508]
We present a generic INR framework that achieves both data and training efficiency by learning a Neural Implicit Dictionary (NID).
Our NID assembles a group of coordinate-based subnetworks which are tuned to span the desired function space.
Our experiments show that NID can speed up the reconstruction of 2D images or 3D scenes by two orders of magnitude while using up to 98% less input data.
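The dictionary idea can be sketched as a set of shared coordinate-based basis networks whose outputs are combined by per-instance coefficients, so fitting a new signal mostly amounts to finding the coefficients. The code below is a hedged illustration; the paper's full design additionally uses mixture-of-expert gating during training.

```python
import torch
import torch.nn as nn

class NeuralImplicitDictionary(nn.Module):
    """A dictionary of coordinate-based basis networks; each instance is
    represented by coefficients over the shared basis (illustrative sketch)."""
    def __init__(self, num_basis=64, hidden=32):
        super().__init__()
        self.basis = nn.ModuleList(
            nn.Sequential(nn.Linear(2, hidden), nn.Tanh(), nn.Linear(hidden, 1))
            for _ in range(num_basis)
        )

    def forward(self, coords, coeffs):            # coeffs: (num_basis,)
        outs = torch.cat([b(coords) for b in self.basis], dim=1)  # (N, num_basis)
        return outs @ coeffs.unsqueeze(1)         # functional combination

nid = NeuralImplicitDictionary()
coeffs = torch.randn(64) * 0.1                    # per-instance code
img_vals = nid(torch.rand(1024, 2), coeffs)       # (1024, 1)
```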
arXiv Detail & Related papers (2022-07-08T05:07:19Z)
- Variational models for signal processing with Graph Neural Networks [3.5939555573102853]
This paper is devoted to signal processing on point clouds by means of neural networks.
In this work, we investigate the use of variational models for such Graph Neural Networks to process signals on graphs for unsupervised learning.
arXiv Detail & Related papers (2021-03-30T13:31:11Z)
- Deep Networks for Direction-of-Arrival Estimation in Low SNR [89.45026632977456]
We introduce a Convolutional Neural Network (CNN) that is trained from multi-channel data of the true array manifold matrix.
We train a CNN in the low-SNR regime to predict DoAs across all SNRs.
Our robust solution can be applied in several fields, ranging from wireless array sensors to acoustic microphones or sonars.
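A common way to set up such a CNN, sketched below, is to feed the array covariance matrix with real and imaginary parts stacked as channels and predict logits over a discrete angle grid; the input encoding and all sizes here are assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class DoACNN(nn.Module):
    """Classify directions of arrival from an M x M array covariance matrix,
    stacking real and imaginary parts as channels (a common setup; the
    paper's exact input encoding may differ)."""
    def __init__(self, m=16, num_angles=121):     # e.g., a 1-degree grid over -60..60
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * m * m, num_angles),
        )

    def forward(self, cov):                       # cov: (B, M, M) complex
        x = torch.stack([cov.real, cov.imag], dim=1)  # (B, 2, M, M)
        return self.net(x)                        # per-angle logits

model = DoACNN()
cov = torch.randn(4, 16, 16, dtype=torch.cfloat)
logits = model(cov)                               # (4, 121)
```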
arXiv Detail & Related papers (2020-11-17T12:52:18Z)
- Operational vs Convolutional Neural Networks for Image Denoising [25.838282412957675]
Convolutional Neural Networks (CNNs) have recently become a favored technique for image denoising due to their adaptive learning ability.
We propose a heterogeneous network model which allows greater flexibility for embedding additional non-linearity at the core of the data transformation.
An extensive set of comparative evaluations of ONNs and CNNs on two severe image denoising problems yields conclusive evidence that ONNs enriched by non-linear operators achieve superior denoising performance compared to CNNs with both equivalent and well-known deep configurations.
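Operational layers generalize convolution by swapping the linear nodal operator for a richer one. One common instantiation, sketched below, applies a learned convolution to successive powers of the input and sums the results, injecting extra non-linearity at the core of the transformation; the paper's actual operator set may differ.

```python
import torch
import torch.nn as nn

class OperationalConv2d(nn.Module):
    """An operational layer sketched with a power-series nodal operator:
    a learned convolution per power of the input, summed (one common way
    to add non-linearity; illustrative, not the paper's exact operators)."""
    def __init__(self, in_ch, out_ch, k=3, order=3):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for _ in range(order)
        )

    def forward(self, x):
        return sum(conv(x ** (q + 1)) for q, conv in enumerate(self.convs))

layer = OperationalConv2d(1, 16)
feat = torch.tanh(layer(torch.randn(1, 1, 64, 64)))   # non-linear feature maps
```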
arXiv Detail & Related papers (2020-09-01T12:15:28Z)