Spiking Generative Adversarial Network with Attention Scoring Decoding
- URL: http://arxiv.org/abs/2305.10246v3
- Date: Fri, 19 May 2023 08:08:42 GMT
- Title: Spiking Generative Adversarial Network with Attention Scoring Decoding
- Authors: Linghao Feng, Dongcheng Zhao, Yi Zeng
- Abstract summary: Spiking neural networks offer a closer approximation to brain-like processing.
We build a spiking generative adversarial network capable of handling complex images.
- Score: 4.5727987473456055
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative models based on neural networks present a substantial challenge
within deep learning. As it stands, such models are primarily limited to the
domain of artificial neural networks. Spiking neural networks, as the third
generation of neural networks, offer a closer approximation to brain-like
processing due to their rich spatiotemporal dynamics. However, generative
models based on spiking neural networks are not well studied. In this work, we
pioneer the construction of a spiking generative adversarial network capable of
handling complex images. Our first task was to identify the problems of
out-of-domain inconsistency and temporal inconsistency inherent in spiking
generative adversarial networks. We addressed these issues by incorporating the
Earth-Mover distance and an attention-based weighted decoding method,
significantly enhancing the performance of our algorithm across several
datasets. Experimental results reveal that our approach outperforms existing
methods on the MNIST, FashionMNIST, CIFAR10, and CelebA datasets. Moreover,
compared with hybrid spiking generative adversarial networks, where the
discriminator is an artificial analog neural network, our methodology
demonstrates closer alignment with the information processing patterns observed
in the mouse brain.
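The two fixes the abstract names lend themselves to a short sketch. Below is a minimal, illustrative PyTorch sketch (not the authors' released code) of both ingredients: an attention-based weighted decoding that collapses the generator's per-timestep spike frames into a single image, and an Earth-Mover (Wasserstein) critic objective, estimated here with the common gradient-penalty approach, which is our assumption rather than something the abstract specifies. All class, function, and parameter names are hypothetical.

```python
# Illustrative sketch only; the authors' actual architecture, decoding
# weights, and EM-distance estimator may differ.
import torch
import torch.nn as nn


class AttentionScoringDecoder(nn.Module):
    """Learn one attention logit per timestep and take a weighted sum of
    spike frames, instead of a plain average over time."""

    def __init__(self, num_steps: int):
        super().__init__()
        # Hypothetical parameterization: one learnable logit per timestep.
        self.logits = nn.Parameter(torch.zeros(num_steps))

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        # spikes: (T, B, C, H, W) binary spike trains from the spiking generator.
        weights = torch.softmax(self.logits, dim=0)  # attention scores over timesteps
        return torch.einsum("t,tbchw->bchw", weights, spikes)


def critic_loss(critic: nn.Module, real: torch.Tensor, fake: torch.Tensor,
                gp_weight: float = 10.0) -> torch.Tensor:
    """Earth-Mover (Wasserstein) critic loss, estimated with the WGAN-GP
    gradient penalty (one standard estimator; assumed, not stated by the paper)."""
    fake = fake.detach()
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = eps * real + (1.0 - eps) * fake
    interp.requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    penalty = ((grad.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()
    return critic(fake).mean() - critic(real).mean() + gp_weight * penalty
```

Relative to averaging spikes uniformly over time, the learned softmax weights let more informative timesteps dominate the decoded image, which is one plausible reading of "attention scoring decoding" and of how the temporal-inconsistency issue could be mitigated.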
Related papers
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Expressivity of Spiking Neural Networks [15.181458163440634]
We study the capabilities of spiking neural networks where information is encoded in the firing time of neurons.
In contrast to ReLU networks, we prove that spiking neural networks can realize both continuous and discontinuous functions.
arXiv Detail & Related papers (2023-08-16T08:45:53Z)
- Addressing caveats of neural persistence with deep graph persistence [54.424983583720675]
We find that the variance of network weights and spatial concentration of large weights are the main factors that impact neural persistence.
We propose an extension of the filtration underlying neural persistence to the whole neural network instead of single layers.
This yields our deep graph persistence measure, which implicitly incorporates persistent paths through the network and alleviates variance-related issues.
arXiv Detail & Related papers (2023-07-20T13:34:11Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Spiking Neural Networks for Frame-based and Event-based Single Object Localization [26.51843464087218]
Spiking neural networks have shown much promise as an energy-efficient alternative to artificial neural networks.
We propose a spiking neural network approach for single object localization trained using surrogate gradient descent.
We compare our method with similar artificial neural networks and show that our model achieves competitive or better accuracy, greater robustness to various corruptions, and lower energy consumption.
arXiv Detail & Related papers (2022-06-13T22:22:32Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Creating Powerful and Interpretable Models with Regression Networks [2.2049183478692584]
We propose a novel architecture, Regression Networks, which combines the power of neural networks with the understandability of regression analysis.
We demonstrate that the models exceed the state-of-the-art performance of interpretable models on several benchmark datasets.
arXiv Detail & Related papers (2021-07-30T03:37:00Z)