Convolutional Neural Generative Coding: Scaling Predictive Coding to
Natural Images
- URL: http://arxiv.org/abs/2211.12047v1
- Date: Tue, 22 Nov 2022 06:42:41 GMT
- Title: Convolutional Neural Generative Coding: Scaling Predictive Coding to
Natural Images
- Authors: Alexander Ororbia, Ankur Mali
- Abstract summary: We develop convolutional neural generative coding (Conv-NGC).
We implement a flexible neurobiologically-motivated algorithm that progressively refines latent state maps.
We study the effectiveness of our brain-inspired neural system on the tasks of reconstruction and image denoising.
- Score: 79.07468367923619
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we develop convolutional neural generative coding (Conv-NGC), a
generalization of predictive coding to the case of
convolution/deconvolution-based computation. Specifically, we concretely
implement a flexible neurobiologically-motivated algorithm that progressively
refines latent state maps in order to dynamically form a more accurate internal
representation/reconstruction model of natural images. The performance of the
resulting sensory processing system is evaluated on several benchmark datasets
such as Color-MNIST, CIFAR-10, and Street View House Numbers (SVHN). We study
the effectiveness of our brain-inspired neural system on the tasks of
reconstruction and image denoising and find that it is competitive with
convolutional auto-encoding systems trained by backpropagation of errors and
notably outperforms them with respect to out-of-distribution reconstruction
(including on the full 90k CINIC-10 test set).
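The central mechanism, iteratively settling latent state maps against prediction errors before any weight update, can be sketched in a minimal single-layer form. The sizes, update rate, and single-filter generative model below are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def conv_full(z, W):
    """Full 2D convolution: latent map (m x m) -> predicted image (m+k-1)."""
    k = W.shape[0]
    win = sliding_window_view(np.pad(z, k - 1), W.shape)
    return np.einsum("ijkl,kl->ij", win, W[::-1, ::-1])

def corr_valid(e, W):
    """Valid 2D correlation: the adjoint of conv_full, used for state updates."""
    win = sliding_window_view(e, W.shape)
    return np.einsum("ijkl,kl->ij", win, W)

rng = np.random.default_rng(0)
k, m = 5, 12
W = rng.standard_normal((k, k)) * 0.1
x = conv_full(rng.standard_normal((m, m)), W)   # "sensory" image

z = np.zeros((m, m))   # latent state map, refined over settling steps
eta = 0.05             # state update rate
errors = []
for _ in range(200):
    e = x - conv_full(z, W)        # prediction-error map
    z += eta * corr_valid(e, W)    # refine the state, not the weights
    errors.append(float(np.sum(e ** 2)))

print(errors[-1] < errors[0])      # True: error shrinks as the state settles
```

In the full model this settling runs across a hierarchy of state maps, with weight updates applied only after the states have converged.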
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - Neural information coding for efficient spike-based image denoising [0.5156484100374058]
In this work we investigate Spiking Neural Networks (SNNs) for Gaussian denoising.
We propose a formal analysis of the information conversion processing carried out by the Leaky Integrate and Fire (LIF) neurons.
We compare its performance with the classical rate-coding mechanism.
Our results show that SNNs with LIF neurons can provide competitive denoising performance but at a reduced computational cost.
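In its discrete-time form, the LIF neuron analyzed above reduces to a few lines; the threshold, reset, and leak values here are illustrative:

```python
# Minimal discrete-time leaky integrate-and-fire (LIF) neuron: the membrane
# potential leaks toward rest, integrates the input current, and emits a
# spike when it crosses the threshold, after which it resets.
def lif_spikes(current, v_th=1.0, v_reset=0.0, leak=0.9):
    v, spikes = 0.0, []
    for i_t in current:
        v = leak * v + i_t      # leaky integration of input current
        if v >= v_th:           # threshold crossing -> spike
            spikes.append(1)
            v = v_reset         # reset after firing
        else:
            spikes.append(0)
    return spikes

# A constant sub-threshold current yields a regular spike train:
print(lif_spikes([0.4] * 10))   # [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```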
arXiv Detail & Related papers (2023-05-15T09:05:32Z) - Exploring the Rate-Distortion-Complexity Optimization in Neural Image
Compression [26.1947289647201]
We study the rate-distortion-complexity (RDC) optimization in neural image compression.
By quantifying the decoding complexity as a factor in the optimization goal, we are now able to precisely control the RDC trade-off.
A variable-complexity neural network is designed to adaptively leverage spatial dependencies according to industrial demands.
arXiv Detail & Related papers (2023-05-12T03:56:25Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
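The symmetry constraint itself is easy to state concretely. A DeepSets-style layer (a generic illustration of permutation equivariance, not this paper's architecture) commutes with any permutation of its inputs because each element is transformed by the same weights plus a function of the elementwise sum:

```python
import numpy as np

def equivariant_layer(X, lam=0.5, gamma=0.1):
    # Same per-element map for every row, plus a permutation-invariant
    # summary (the column sums), so permuting rows permutes outputs.
    return lam * X + gamma * X.sum(axis=0, keepdims=True)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))   # 5 interchangeable elements
perm = rng.permutation(5)

out_then_perm = equivariant_layer(X)[perm]
perm_then_out = equivariant_layer(X[perm])
print(np.allclose(out_then_perm, perm_then_out))   # True: equivariance holds
```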
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - A Deep Learning-based in silico Framework for Optimization on Retinal
Prosthetic Stimulation [3.870538485112487]
We propose a neural network-based framework to optimize the perceptions simulated by the in silico retinal implant model pulse2percept.
The pipeline consists of a trainable encoder, a pre-trained retinal implant model and a pre-trained evaluator.
arXiv Detail & Related papers (2023-02-07T16:32:05Z) - Modality-Agnostic Variational Compression of Implicit Neural
Representations [96.35492043867104]
We introduce a modality-agnostic neural compression algorithm based on a functional view of data and parameterised as an Implicit Neural Representation (INR).
Bridging the gap between latent coding and sparsity, we obtain compact latent representations non-linearly mapped to a soft gating mechanism.
After obtaining a dataset of such latent representations, we directly optimise the rate/distortion trade-off in a modality-agnostic space using neural compression.
arXiv Detail & Related papers (2023-01-23T15:22:42Z) - The Predictive Forward-Forward Algorithm [79.07468367923619]
We propose the predictive forward-forward (PFF) algorithm for conducting credit assignment in neural systems.
We design a novel, dynamic recurrent neural system that learns a directed generative circuit jointly and simultaneously with a representation circuit.
PFF efficiently learns to propagate learning signals and updates synapses with forward passes only.
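The "forward passes only" idea can be illustrated with a single linear layer trained on a local goodness objective. The objective and update below follow the general forward-forward recipe under simplifying assumptions (a linear layer, squared-activity goodness), not the exact PFF circuit:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((8, 4)) * 0.1     # one layer's weights

def goodness(x):
    h = W @ x             # layer activity from a single forward pass
    return float(h @ h)   # "goodness": summed squared activity

def local_update(x, positive, lr=0.01):
    """Raise goodness for positive data, lower it for negative -- no backprop."""
    global W
    h = W @ x
    grad = 2.0 * np.outer(h, x)   # d(goodness)/dW, computed locally
    W += lr * grad if positive else -lr * grad

x_pos = rng.standard_normal(4)
g_before = goodness(x_pos)
for _ in range(20):
    local_update(x_pos, positive=True)
print(goodness(x_pos) > g_before)   # True: learned with forward passes only
```

Each layer learns from its own activity, so no error signal ever travels backward through the network.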
arXiv Detail & Related papers (2023-01-04T05:34:48Z) - Learning Optimal K-space Acquisition and Reconstruction using
Physics-Informed Neural Networks [46.751292014516025]
Deep neural networks have been applied to reconstruct undersampled k-space data and have shown improved reconstruction performance.
This work proposes a novel framework to learn k-space sampling trajectories by considering it as an Ordinary Differential Equation (ODE) problem.
Experiments were conducted on different in-vivo datasets (e.g., brain and knee images) acquired with different sequences.
arXiv Detail & Related papers (2022-04-05T20:28:42Z) - Convolutional Analysis Operator Learning by End-To-End Training of
Iterative Neural Networks [3.6280929178575994]
We show how convolutional sparsifying filters can be efficiently learned by end-to-end training of iterative neural networks.
We evaluated our approach on a non-Cartesian 2D cardiac cine MRI example and show that the obtained filters are better suited to the corresponding reconstruction algorithm than those obtained by decoupled pre-training.
arXiv Detail & Related papers (2022-03-04T07:32:16Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.