Image segmentation with traveling waves in an exactly solvable recurrent
neural network
- URL: http://arxiv.org/abs/2311.16943v1
- Date: Tue, 28 Nov 2023 16:46:44 GMT
- Title: Image segmentation with traveling waves in an exactly solvable recurrent
neural network
- Authors: Luisa H. B. Liboni, Roberto C. Budzinski, Alexandra N. Busch, Sindy Löwe, Thomas A. Keller, Max Welling, Lyle E. Muller
- Abstract summary: We show that a recurrent neural network can effectively divide an image into groups according to a scene's structural characteristics.
We present a precise description of the mechanism underlying object segmentation in this network.
We then demonstrate a simple algorithm for object segmentation that generalizes across inputs ranging from simple geometric objects in grayscale images to natural images.
- Score: 71.74150501418039
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study image segmentation using spatiotemporal dynamics in a recurrent
neural network where the state of each unit is given by a complex number. We
show that this network generates sophisticated spatiotemporal dynamics that can
effectively divide an image into groups according to a scene's structural
characteristics. Using an exact solution of the recurrent network's dynamics,
we present a precise description of the mechanism underlying object
segmentation in this network, providing a clear mathematical interpretation of
how the network performs this task. We then demonstrate a simple algorithm for
object segmentation that generalizes across inputs ranging from simple
geometric objects in grayscale images to natural images. Object segmentation
across all images is accomplished with one recurrent neural network that has a
single, fixed set of weights. This demonstrates the expressive potential of
recurrent neural networks when constructed using a mathematical approach that
brings together their structure, dynamics, and computation.
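As a rough illustration of the mechanism the abstract describes, the sketch below runs a linear recurrence z_{t+1} = W z_t over complex-valued unit states, with W built from pixel-intensity affinities. This is a toy stand-in, not the paper's construction: the Gaussian affinity matrix, the spectral normalization, and the phase readout are all assumptions made for illustration. Because the update is linear, the trajectory has a closed form in terms of the eigendecomposition of W, which is the sense in which such networks are exactly solvable.

```python
import numpy as np

def affinity_matrix(img, sigma=0.1):
    # Pairwise Gaussian affinity between flattened pixel intensities
    # (an illustrative choice of coupling, not the paper's).
    v = img.reshape(-1, 1).astype(float)
    return np.exp(-(v - v.T) ** 2 / (2 * sigma ** 2))

def complex_rnn_phases(img, steps=20, seed=0):
    # Linear complex-valued recurrence z_{t+1} = W z_t from random unit phases.
    # Linearity means the whole trajectory is determined in closed form by the
    # eigendecomposition of W.
    W = affinity_matrix(img)
    W = W / np.abs(np.linalg.eigvals(W)).max()  # keep spectral radius at 1
    rng = np.random.default_rng(seed)
    z = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, W.shape[0]))
    for _ in range(steps):
        z = W @ z
    # Strongly coupled (similar) pixels synchronize; read out the phases.
    return np.angle(z).reshape(img.shape)

# Toy image: two homogeneous regions that should settle into distinct phases.
img = np.zeros((4, 4))
img[:, 2:] = 1.0
phases = complex_rnn_phases(img)
```

On this toy input, pixels within each homogeneous region end up sharing a phase, so simple clustering (or thresholding) of `phases` recovers the two segments.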
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- The inverse problem for neural networks [3.2634122554914002]
We study the problem of computing the preimage of a set under a neural network with piecewise-affine activation functions.
We show several applications of computing the preimage for analysis and interpretability of neural networks.
arXiv Detail & Related papers (2023-08-27T12:35:38Z)
- Aesthetics and neural network image representations [0.0]
We analyze the spaces of images encoded by generative neural networks of the BigGAN architecture.
We find that generic multiplicative perturbations of neural network parameters away from the photo-realistic point often lead to networks generating images which appear as "artistic renditions" of the corresponding objects.
arXiv Detail & Related papers (2021-09-16T16:50:22Z)
- Joint Learning of Neural Transfer and Architecture Adaptation for Image Recognition [77.95361323613147]
Current state-of-the-art visual recognition systems rely on pretraining a neural network on a large-scale dataset and finetuning the network weights on a smaller dataset.
In this work, we show that dynamically adapting network architectures tailored to each domain task, along with weight finetuning, benefits both efficiency and effectiveness.
Our method can be easily generalized to an unsupervised paradigm by replacing supernet training with self-supervised learning in the source domain tasks and performing linear evaluation in the downstream tasks.
arXiv Detail & Related papers (2021-03-31T08:15:17Z)
- Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of routing every input through the same fixed path, DG-Net aggregates features dynamically at each node, which gives the network greater representational ability.
arXiv Detail & Related papers (2020-10-02T16:50:26Z)
- Understanding the Role of Individual Units in a Deep Neural Network [85.23117441162772]
We present an analytic framework to systematically identify hidden units within image classification and image generation networks.
First, we analyze a convolutional neural network (CNN) trained on scene classification and discover units that match a diverse set of object concepts.
Second, we use a similar analytic method to analyze a generative adversarial network (GAN) model trained to generate scenes.
arXiv Detail & Related papers (2020-09-10T17:59:10Z)
- The Representation Theory of Neural Networks [7.724617675868718]
We show that neural networks can be represented via the mathematical theory of quiver representations.
We show that network quivers gently adapt to common neural network concepts.
We also provide a quiver representation model to understand how a neural network creates representations from the data.
arXiv Detail & Related papers (2020-07-23T19:02:14Z)
- CRNet: Cross-Reference Networks for Few-Shot Segmentation [59.85183776573642]
Few-shot segmentation aims to learn a segmentation model that can be generalized to novel classes with only a few training images.
With a cross-reference mechanism, our network can better find the co-occurrent objects in the two images.
Experiments on the PASCAL VOC 2012 dataset show that our network achieves state-of-the-art performance.
arXiv Detail & Related papers (2020-03-24T04:55:43Z)
- Geometric Approaches to Increase the Expressivity of Deep Neural Networks for MR Reconstruction [41.62169556793355]
Deep learning approaches have been extensively investigated to reconstruct images from accelerated magnetic resonance image (MRI) acquisition.
It is not clear how to choose a suitable network architecture to balance the trade-off between network complexity and performance.
This paper proposes a systematic geometric approach using bootstrapping and subnetwork aggregation to increase the expressivity of the underlying neural network.
arXiv Detail & Related papers (2020-03-17T14:18:37Z)
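Among the related papers above, the preimage problem for piecewise-affine networks admits a concrete brute-force sketch for a single ReLU layer: enumerate activation patterns, and within each pattern the map is affine, so the preimage of a box is a polytope of linear inequalities. The function names and the box-shaped target set below are illustrative assumptions; the cited paper's actual algorithm may differ and need not rely on naive enumeration.

```python
import itertools
import numpy as np

def relu_preimage_regions(W, b, lo, hi):
    # Preimage of the box [lo, hi]^m under x -> relu(W x + b), expressed as a
    # union of polytopes {x : A x <= c}, one per ReLU activation pattern.
    m = W.shape[0]
    regions = []
    for pattern in itertools.product([False, True], repeat=m):
        A, c = [], []
        feasible = True
        for i, active in enumerate(pattern):
            if active:
                # Unit active: (W x + b)_i >= 0 and lo <= (W x + b)_i <= hi.
                A += [-W[i], W[i], -W[i]]
                c += [b[i], hi - b[i], b[i] - lo]
            else:
                if lo > 0:  # inactive unit outputs 0, which misses the box
                    feasible = False
                    break
                A.append(W[i])   # (W x + b)_i <= 0
                c.append(-b[i])
        if feasible:
            regions.append((np.array(A), np.array(c)))
    return regions

def in_preimage(x, regions, tol=1e-9):
    # x maps into the box iff it satisfies some pattern's inequality system.
    return any(np.all(A @ x <= c + tol) for A, c in regions)

# Tiny demo: for the identity map with zero bias, (0.5, 0.5) lands in [0,1]^2.
demo = relu_preimage_regions(np.eye(2), np.zeros(2), 0.0, 1.0)
```

The enumeration is exponential in the number of units, which is exactly why this is only a sketch of the idea rather than a practical method.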
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality or accuracy of the information above and is not responsible for any consequences of its use.