Neuromorphic Camera Denoising using Graph Neural Network-driven
Transformers
- URL: http://arxiv.org/abs/2112.09685v1
- Date: Fri, 17 Dec 2021 18:57:36 GMT
- Title: Neuromorphic Camera Denoising using Graph Neural Network-driven
Transformers
- Authors: Yusra Alkendi, Rana Azzam, Abdulla Ayyad, Sajid Javed, Lakmal
Seneviratne, and Yahya Zweiri
- Abstract summary: Neuromorphic vision is a bio-inspired technology that has triggered a paradigm shift in the computer-vision community.
Neuromorphic cameras suffer from significant amounts of measurement noise.
This noise deteriorates the performance of neuromorphic event-based perception and navigation algorithms.
- Score: 3.805262583092311
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic vision is a bio-inspired technology that has triggered a
paradigm shift in the computer-vision community and is serving as a key-enabler
for a multitude of applications. This technology has offered significant
advantages including reduced power consumption, reduced processing needs, and
communication speed-ups. However, neuromorphic cameras suffer from significant
amounts of measurement noise. This noise deteriorates the performance of
neuromorphic event-based perception and navigation algorithms. In this paper,
we propose a novel noise filtration algorithm to eliminate events which do not
represent real log-intensity variations in the observed scene. We employ a
Graph Neural Network (GNN)-driven transformer algorithm, called
GNN-Transformer, to classify every active event pixel in the raw stream into
real log-intensity variation or noise. Within the GNN, a message-passing
framework, called EventConv, is carried out to reflect the spatiotemporal
correlation among the events, while preserving their asynchronous nature. We
also introduce the Known-object Ground-Truth Labeling (KoGTL) approach for
generating approximate ground truth labels of event streams under various
illumination conditions. KoGTL is used to generate labeled datasets from
experiments recorded in challenging lighting conditions. These datasets are
used to train and extensively test our proposed algorithm. When tested on
unseen datasets, the proposed algorithm outperforms existing methods by 12% in
terms of filtration accuracy. Additional tests are also conducted on publicly
available datasets to demonstrate the generalization capabilities of the
proposed algorithm in the presence of illumination variations and different
motion dynamics. Compared to existing solutions, qualitative results verified
the superior capability of the proposed algorithm to eliminate noise while
preserving meaningful scene events.
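The abstract names the components but gives no implementation detail, so the following is only a minimal sketch of the underlying idea: connect events that are close in space and time into a graph, aggregate neighbour information per event (the role EventConv plays), and flag events without spatiotemporal support as noise. The radius parameters, the polarity-based feature, and the threshold "classifier" at the end are illustrative assumptions; the actual method feeds the aggregated features into a transformer classifier.

```python
import numpy as np

# Toy event stream: each row is (x, y, t, polarity). A real recording would
# contain millions of such events arriving asynchronously from the sensor.
events = np.array([
    [10, 12, 0.001,  1],
    [11, 12, 0.002,  1],
    [10, 13, 0.003, -1],
    [90, 45, 0.004,  1],   # isolated event, likely noise
    [11, 13, 0.005,  1],
], dtype=float)

# Assumed neighbourhood: spatial radius in pixels, temporal radius in seconds.
R_XY, R_T = 3.0, 0.005

def build_event_graph(ev):
    """Connect events that are close in both space and time (undirected)."""
    edges = []
    for i in range(len(ev)):
        for j in range(i + 1, len(ev)):
            if (np.abs(ev[i, :2] - ev[j, :2]).max() <= R_XY
                    and abs(ev[i, 2] - ev[j, 2]) <= R_T):
                edges.append((i, j))
    return edges

def event_conv(ev, edges):
    """One mean-aggregation message-passing step over per-event features."""
    feats = ev[:, 3:4].copy()                 # start from polarity as the feature
    agg = np.zeros_like(feats)
    deg = np.zeros((len(ev), 1))
    for i, j in edges:
        agg[i] += feats[j]; agg[j] += feats[i]
        deg[i] += 1; deg[j] += 1
    # Per-event output: own feature, mean of neighbour features, degree.
    return np.concatenate([feats, agg / np.maximum(deg, 1), deg], axis=1)

h = event_conv(events, build_event_graph(events))
# Toy decision: events with no spatiotemporal support are flagged as noise;
# the real pipeline classifies h with a transformer instead of a threshold.
is_noise = h[:, 2] == 0
print(is_noise)   # [False False False  True False]
```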
Related papers
- Noise Filtering Benchmark for Neuromorphic Satellites Observations [39.781091151259766]
Event cameras capture sparse, asynchronous brightness changes which offer high temporal resolution, high dynamic range, low power consumption, and sparse data output.
These advantages make them ideal for Space Situational Awareness, particularly in detecting resident space objects moving within a telescope's field of view.
However, the output from event cameras often includes substantial background activity noise, which is known to be more prevalent in low-light conditions.
This noise can overwhelm the sparse events generated by satellite signals, making detection and tracking more challenging.
arXiv Detail & Related papers (2024-11-18T02:02:24Z)
- Neuromorphic Vision-based Motion Segmentation with Graph Transformer Neural Network [4.386534439007928]
We propose a novel event-based motion segmentation algorithm using a Graph Transformer Neural Network, dubbed GTNN.
Our proposed algorithm processes event streams as 3D graphs through a series of nonlinear transformations to unveil local and global correlations between events.
We show that GTNN outperforms state-of-the-art methods in the presence of dynamic background variations, motion patterns, and multiple dynamic objects with varying sizes and velocities.
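The summary says events are processed "as 3D graphs". A common way to realise this is to place each event at coordinates (x, y, t) and connect k nearest neighbours after normalising the axes; the sketch below (SciPy k-d tree, k = 8, per-axis normalisation) is an assumption-based illustration, not GTNN's actual graph construction.

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy events as (x, y, t) for a 346x260 sensor over 0.1 s.
events = np.random.default_rng(0).uniform(size=(200, 3)) * [346, 260, 0.1]

# Normalise each axis so spatial and temporal distances are comparable.
nodes = events / events.max(axis=0)

k = 8                                       # assumed neighbourhood size
tree = cKDTree(nodes)
_, idx = tree.query(nodes, k=k + 1)         # first neighbour is the node itself
edges = [(i, j) for i in range(len(nodes)) for j in idx[i, 1:]]
print(len(edges))                           # 200 * 8 directed edges
```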
arXiv Detail & Related papers (2024-04-16T22:44:29Z)
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- Hyperspectral Image Denoising via Self-Modulating Convolutional Neural Networks [15.700048595212051]
We introduce a self-modulating convolutional neural network which utilizes correlated spectral and spatial information.
At the core of the model lies a novel block, which allows the network to transform the features in an adaptive manner based on the adjacent spectral data.
Experimental analysis on both synthetic and real data shows that the proposed SM-CNN outperforms other state-of-the-art HSI denoising methods.
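The summary only states that features are transformed "in an adaptive manner based on the adjacent spectral data". One plausible reading, sketched below in PyTorch, is to gate each band's convolutional features with a signal computed from the neighbouring bands; the real SM-CNN block is not specified here, so treat this as a hypothetical illustration.

```python
import torch
import torch.nn as nn

class SpectralModulatedConv(nn.Module):
    """Illustrative block: features of band b are modulated by a gate computed
    from its neighbouring bands (b-1, b+1). Not the actual SM-CNN design."""
    def __init__(self, channels: int = 16):
        super().__init__()
        self.conv = nn.Conv2d(1, channels, 3, padding=1)
        self.gate = nn.Sequential(nn.Conv2d(2, channels, 3, padding=1),
                                  nn.Sigmoid())

    def forward(self, cube: torch.Tensor) -> torch.Tensor:
        bands, _, _ = cube.shape                            # (bands, H, W)
        out = []
        for b in range(bands):
            lo, hi = max(b - 1, 0), min(b + 1, bands - 1)
            neighbours = torch.stack([cube[lo], cube[hi]])  # (2, H, W)
            feat = self.conv(cube[b][None, None])           # (1, C, H, W)
            gate = self.gate(neighbours[None])              # (1, C, H, W)
            out.append(feat * gate)                         # adaptive modulation
        return torch.cat(out)                               # (bands, C, H, W)

y = SpectralModulatedConv()(torch.rand(31, 64, 64))         # toy 31-band cube
print(y.shape)                                              # [31, 16, 64, 64]
```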
arXiv Detail & Related papers (2023-09-15T06:57:43Z)
- Advancing Unsupervised Low-light Image Enhancement: Noise Estimation, Illumination Interpolation, and Self-Regulation [55.07472635587852]
Low-Light Image Enhancement (LLIE) techniques have made notable advancements in preserving image details and enhancing contrast.
These approaches encounter persistent challenges in efficiently mitigating dynamic noise and accommodating diverse low-light scenarios.
We first propose a method for quickly and accurately estimating the noise level in low-light images.
We then devise a Learnable Illumination Interpolator (LII) to satisfy general constraints between illumination and input.
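The entry does not say how the quick estimate works, so the snippet below shows a generic robust estimator (median absolute deviation of horizontal pixel differences), purely to illustrate what a fast noise-level estimate looks like; it is not the paper's method.

```python
import numpy as np

def estimate_noise_sigma(img: np.ndarray) -> float:
    """Robust noise estimate: pixel differences suppress image structure, and
    the median absolute deviation maps to the std of Gaussian noise."""
    d = np.diff(img.astype(float), axis=1)
    mad = np.median(np.abs(d - np.median(d)))
    return float(mad / (0.6745 * np.sqrt(2)))   # MAD of a difference of two noises

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 1, 128), (128, 1))          # smooth toy image
noisy = clean + rng.normal(scale=0.05, size=clean.shape)
print(round(estimate_noise_sigma(noisy), 3))                # ~0.05
```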
arXiv Detail & Related papers (2023-05-17T13:56:48Z)
- Degradation-Noise-Aware Deep Unfolding Transformer for Hyperspectral Image Denoising [9.119226249676501]
Hyperspectral images (HSIs) are often quite noisy because of narrow band spectral filtering.
To reduce the noise in HSI data cubes, both model-driven and learning-based denoising algorithms have been proposed, each with its own limitations.
This paper proposes a Degradation-Noise-Aware Unfolding Network (DNA-Net) that addresses these issues.
arXiv Detail & Related papers (2023-05-06T13:28:20Z)
- Multi-stage image denoising with the wavelet transform [125.2251438120701]
Deep convolutional neural networks (CNNs) are used for image denoising by automatically mining accurate structure information.
We propose a multi-stage image denoising CNN with the wavelet transform (MWDCNN) comprising three stages: a dynamic convolutional block (DCB), two cascaded wavelet transform and enhancement blocks (WEBs), and a residual block (RB).
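The internals of the DCB, WEBs, and RB are not described in this summary. To make the role of the wavelet transform in denoising concrete, here is a classical wavelet-shrinkage baseline (PyWavelets, Haar wavelet, fixed soft threshold, all assumed choices), not MWDCNN itself: decompose, shrink the detail coefficients, and reconstruct.

```python
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="haar", level=2, thresh=0.1):
    """Soft-threshold the detail coefficients at every level, keep the
    approximation band untouched, and invert the transform."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    shrunk = [coeffs[0]]
    for details in coeffs[1:]:
        shrunk.append(tuple(pywt.threshold(c, thresh, mode="soft")
                            for c in details))
    return pywt.waverec2(shrunk, wavelet)

rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0        # toy square image
noisy = clean + rng.normal(scale=0.2, size=clean.shape)
out = wavelet_denoise(noisy)
print(((out - clean) ** 2).mean() < ((noisy - clean) ** 2).mean())   # True
```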
arXiv Detail & Related papers (2022-09-26T03:28:23Z)
- Practical Blind Image Denoising via Swin-Conv-UNet and Data Synthesis [148.16279746287452]
We propose a swin-conv block to incorporate the local modeling ability of the residual convolutional layer and the non-local modeling ability of the Swin Transformer block.
For the training data synthesis, we design a practical noise degradation model which takes into consideration different kinds of noise.
Experiments on AWGN removal and real image denoising demonstrate that the new network architecture design achieves state-of-the-art performance.
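The "practical noise degradation model" is not detailed in this summary; the sketch below only illustrates the general idea of synthesising clean/noisy training pairs by mixing several noise types with random strengths (the specific types, probabilities, and ranges are made up for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

def synthesize_noisy(clean: np.ndarray) -> np.ndarray:
    """Apply a random combination of noise types to a clean patch in [0, 1]."""
    img = clean.copy()
    if rng.random() < 0.8:                                   # Gaussian read noise
        img += rng.normal(scale=rng.uniform(0.01, 0.1), size=img.shape)
    if rng.random() < 0.5:                                   # Poisson shot noise
        peak = rng.uniform(30, 300)
        img = rng.poisson(np.clip(img, 0, 1) * peak) / peak
    if rng.random() < 0.3:                                   # multiplicative speckle
        img = img * (1 + rng.normal(scale=0.05, size=img.shape))
    return np.clip(img, 0, 1).astype(np.float32)

clean = rng.random((64, 64)).astype(np.float32)              # stand-in clean patch
pair = (clean, synthesize_noisy(clean))                      # one training pair
print(pair[1].shape, float(np.abs(pair[1] - pair[0]).mean()))
```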
arXiv Detail & Related papers (2022-03-24T18:11:31Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAIN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- SignalNet: A Low Resolution Sinusoid Decomposition and Estimation Network [79.04274563889548]
We propose SignalNet, a neural network architecture that detects the number of sinusoids and estimates their parameters from quantized in-phase and quadrature samples.
We introduce a worst-case learning threshold for comparing the results of our network relative to the underlying data distributions.
In simulation, we find that our algorithm is always able to surpass the threshold for three-bit data but often cannot exceed the threshold for one-bit data.
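To make the input format concrete, the snippet below generates quantized in-phase/quadrature samples of a sum of sinusoids, the kind of data such a network would consume; the signal parameters, noise level, and mid-rise quantizer are illustrative assumptions rather than SignalNet's exact setup.

```python
import numpy as np

def quantized_iq(freqs, amps, n=256, bits=3, noise=0.05, seed=0):
    """Complex-baseband sum of sinusoids plus noise, with I and Q quantized
    separately by a uniform mid-rise quantizer over [-1, 1]."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    x = sum(a * np.exp(2j * np.pi * f * t) for f, a in zip(freqs, amps))
    x = x + (rng.normal(size=n) + 1j * rng.normal(size=n)) * noise
    levels = 2 ** bits

    def q(v):
        idx = np.clip(np.floor((v + 1) / 2 * levels), 0, levels - 1)
        return (idx + 0.5) / levels * 2 - 1                  # level midpoints

    return q(x.real) + 1j * q(x.imag)

samples = quantized_iq(freqs=[0.12, 0.31], amps=[0.4, 0.3])  # three-bit I/Q data
print(samples[:3])
```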
arXiv Detail & Related papers (2021-06-10T04:21:20Z)
- Robust Processing-In-Memory Neural Networks via Noise-Aware Normalization [26.270754571140735]
PIM accelerators often suffer from intrinsic noise in the physical components.
We propose a noise-agnostic method to achieve robust neural network performance against any noise setting.
arXiv Detail & Related papers (2020-07-07T06:51:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.