Spiking sampling network for image sparse representation and dynamic
vision sensor data compression
- URL: http://arxiv.org/abs/2211.04166v1
- Date: Tue, 8 Nov 2022 11:11:10 GMT
- Title: Spiking sampling network for image sparse representation and dynamic
vision sensor data compression
- Authors: Chunming Jiang, Yilei Zhang
- Abstract summary: Sparse representation has attracted great attention because it can greatly save storage resources and find representative features of data in a low-dimensional space.
In this paper, we propose a spiking sampling network.
This network is composed of spiking neurons, and it can dynamically decide which pixel points should be retained and which ones need to be masked according to the input.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sparse representation has attracted great attention because it can greatly
save storage resources and find representative features of data in a
low-dimensional space. As a result, it may be widely applied in engineering
domains including feature extraction, compressed sensing, signal denoising,
picture clustering, and dictionary learning, just to name a few. In this paper,
we propose a spiking sampling network. This network is composed of spiking
neurons, and it can dynamically decide which pixel points should be retained
and which ones need to be masked according to the input. Our experiments
demonstrate that this approach enables better sparse representation of the
original image and facilitates image reconstruction compared to random
sampling. We thus use this approach for compressing massive data from the
dynamic vision sensor, which greatly reduces the storage requirements for event
data.
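The abstract describes the method only at a high level. As a minimal, hedged sketch of the general idea (not the authors' code), the snippet below assumes a PyTorch implementation in which a layer of leaky integrate-and-fire neurons emits a binary spike map that serves as a per-pixel sampling mask. The class name `SpikingSampler`, the single-convolution encoder, and the LIF constants are illustrative assumptions; training such a hard-thresholded mask would additionally require a surrogate gradient, which is omitted here.

```python
import torch
import torch.nn as nn


class SpikingSampler(nn.Module):
    """Emits a 0/1 mask deciding which pixels of the input are retained."""

    def __init__(self, channels=1, threshold=1.0, decay=0.5, steps=4):
        super().__init__()
        self.encode = nn.Conv2d(channels, 1, kernel_size=3, padding=1)
        self.threshold, self.decay, self.steps = threshold, decay, steps

    def forward(self, x):
        current = self.encode(x)                   # per-pixel input current
        v = torch.zeros_like(current)              # membrane potential
        kept = torch.zeros_like(current)
        for _ in range(self.steps):
            v = self.decay * v + current           # leaky integration
            spike = (v >= self.threshold).float()  # hard threshold (no surrogate grad)
            v = v * (1.0 - spike)                  # reset membrane after a spike
            kept = torch.maximum(kept, spike)      # pixel kept if it ever spiked
        return kept                                # binary mask, same H x W as x


# Usage: keep only the sampled pixels of a toy grayscale image.
img = torch.rand(1, 1, 28, 28)
mask = SpikingSampler()(img)
sparse_img = img * mask
print(f"retained {mask.mean().item():.1%} of pixels")
```

The same input-dependent mask can in principle be applied to event frames from a dynamic vision sensor, which is the compression use case the abstract refers to.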
Related papers
- Streaming Neural Images [56.41827271721955]
Implicit Neural Representations (INRs) are a novel paradigm for signal representation that have attracted considerable interest for image compression.
In this work, we explore the critical yet overlooked limiting factors of INRs, such as computational cost, unstable performance, and robustness (a minimal sketch of the core INR idea appears after this list).
arXiv Detail & Related papers (2024-09-25T17:51:20Z)
- SHACIRA: Scalable HAsh-grid Compression for Implicit Neural Representations [46.01969382873856]
Implicit Neural Representations (INR) or neural fields have emerged as a popular framework to encode multimedia signals.
We propose SHACIRA, a framework for compressing such feature grids with no additional post-hoc pruning/quantization stages.
Our approach outperforms existing INR approaches without the need for any large datasets or domain-specifics.
arXiv Detail & Related papers (2023-09-27T17:59:48Z)
- Beyond Learned Metadata-based Raw Image Reconstruction [86.1667769209103]
Raw images have distinct advantages over sRGB images, e.g., linearity and fine-grained quantization levels.
However, they are not widely adopted by general users due to their substantial storage requirements.
We propose a novel framework that learns a compact representation in the latent space, serving as metadata.
arXiv Detail & Related papers (2023-06-21T06:59:07Z)
- Compression with Bayesian Implicit Neural Representations [16.593537431810237]
We propose overfitting variational neural networks to the data and compressing an approximate posterior weight sample using relative entropy coding instead of quantizing and entropy coding it.
Experiments show that our method achieves strong performance on image and audio compression while retaining simplicity.
arXiv Detail & Related papers (2023-05-30T16:29:52Z)
- Raw Image Reconstruction with Learned Compact Metadata [61.62454853089346]
We propose a novel framework to learn a compact representation in the latent space serving as the metadata in an end-to-end manner.
We show how the proposed raw image compression scheme can adaptively allocate more bits to image regions that are important from a global perspective.
arXiv Detail & Related papers (2023-02-25T05:29:45Z)
- Hyperspectral Image Compression Using Implicit Neural Representation [1.4721615285883425]
This paper develops a method for hyperspectral image compression using implicit neural representations.
We show the proposed method achieves better compression than JPEG, JPEG2000, and PCA-DCT at low bitrates.
arXiv Detail & Related papers (2023-02-08T15:27:00Z)
- Variable Bitrate Neural Fields [75.24672452527795]
We present a dictionary method for compressing feature grids, reducing their memory consumption by up to 100x.
We formulate the dictionary optimization as a vector-quantized auto-decoder problem which lets us learn end-to-end discrete neural representations in a space where no direct supervision is available.
arXiv Detail & Related papers (2022-06-15T17:58:34Z)
- Deep data compression for approximate ultrasonic image formation [1.0266286487433585]
In ultrasonic imaging systems, data acquisition and image formation are performed on separate computing devices.
Deep neural networks are optimized to preserve the image quality of a particular image formation method.
arXiv Detail & Related papers (2020-09-04T16:43:12Z)
- Impression Space from Deep Template Network [72.86001835304185]
We show that a trained convolutional neural network has the capability to "remember" its input images.
We propose a framework to establish an Impression Space upon an off-the-shelf pretrained network.
arXiv Detail & Related papers (2020-07-10T15:29:33Z)
- Neural Sparse Representation for Image Restoration [116.72107034624344]
Inspired by the robustness and efficiency of sparse coding based image restoration models, we investigate the sparsity of neurons in deep networks.
Our method structurally enforces sparsity constraints upon hidden neurons.
Experiments show that sparse representation is crucial in deep neural networks for multiple image restoration tasks.
arXiv Detail & Related papers (2020-06-08T05:15:17Z)
- Distributed Learning and Inference with Compressed Images [40.07509530656681]
This paper focuses on vision-based perception for autonomous driving as a paradigmatic scenario.
We propose dataset restoration, based on image restoration with generative adversarial networks (GANs).
Our method is agnostic to both the particular image compression method and the downstream task.
arXiv Detail & Related papers (2020-04-22T11:20:53Z)
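Several of the related papers above (Streaming Neural Images, SHACIRA, Compression with Bayesian Implicit Neural Representations, Hyperspectral Image Compression Using Implicit Neural Representation, Variable Bitrate Neural Fields) build on the same implicit-neural-representation idea: overfit a small coordinate-to-value network to one signal and store its weights instead of the pixels. The sketch below is a minimal, hedged illustration of that idea in PyTorch; the plain ReLU MLP, layer sizes, and optimizer settings are assumptions for illustration and do not reproduce any particular paper's method.

```python
import torch
import torch.nn as nn


def make_inr(hidden=64):
    # Tiny coordinate -> intensity MLP; real INR codecs use positional
    # encodings, SIREN layers, or feature grids instead of a plain ReLU MLP.
    return nn.Sequential(
        nn.Linear(2, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, 1),
    )


def fit_inr(image, steps=2000, lr=1e-3):
    """image: (H, W) float tensor in [0, 1]; returns an MLP overfit to it."""
    h, w = image.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h),
                            torch.linspace(-1, 1, w), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)   # (H*W, 2)
    targets = image.reshape(-1, 1)
    net = make_inr()
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((net(coords) - targets) ** 2).mean()
        loss.backward()
        opt.step()
    return net  # the "compressed" image is net.state_dict(), not the pixels


# Usage on a toy image; quantizing or entropy coding the stored weights is
# the step where the listed papers differ.
net = fit_inr(torch.rand(32, 32), steps=200)
```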
This list is automatically generated from the titles and abstracts of the papers in this site.