NRTR: Neuron Reconstruction with Transformer from 3D Optical Microscopy Images
- URL: http://arxiv.org/abs/2212.04163v1
- Date: Thu, 8 Dec 2022 09:35:22 GMT
- Title: NRTR: Neuron Reconstruction with Transformer from 3D Optical Microscopy Images
- Authors: Yijun Wang, Rui Lang, Rui Li and Junsong Zhang
- Abstract summary: We propose a Neuron Reconstruction Transformer (NRTR) that views neuron reconstruction as a direct set-prediction problem.
NRTR is the first image-to-set deep learning model for end-to-end neuron reconstruction.
- Score: 5.724034347184251
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The neuron reconstruction from raw Optical Microscopy (OM) image stacks is
the basis of neuroscience. Manual annotation and semi-automatic neuron tracing
algorithms are time-consuming and inefficient. Existing deep learning neuron
reconstruction methods, although demonstrating exemplary performance, rely
heavily on complex rule-based components. Therefore, a crucial challenge is
designing an end-to-end neuron reconstruction method that makes the overall
framework simpler and model training easier. We propose a Neuron Reconstruction
Transformer (NRTR) that, discarding the complex rule-based components, views
neuron reconstruction as a direct set-prediction problem. To the best of our
knowledge, NRTR is the first image-to-set deep learning model for end-to-end
neuron reconstruction. In experiments using the BigNeuron and VISoR-40
datasets, NRTR achieves excellent neuron reconstruction results for
comprehensive benchmarks and outperforms competitive baselines. Extensive
experiments indicate that NRTR is effective and that neuron reconstruction can
indeed be cast as a set-prediction problem, which makes end-to-end model
training feasible.
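As a rough illustration of the set-prediction formulation, below is a minimal DETR-style sketch in PyTorch: a 3D convolutional backbone tokenizes the image stack and learned queries decode a fixed-size set of candidate neuron nodes. The class name, layer sizes, and query count are illustrative assumptions, not the authors' architecture.
```python
# Minimal DETR-style image-to-set sketch (illustrative, not the authors' code).
# A 3D CNN backbone tokenizes the OM stack; learned queries decode a fixed-size
# set of candidate neuron nodes (x, y, z) plus an "is a real node" confidence.
# Positional encodings are omitted for brevity.
import torch
import torch.nn as nn

class SetPredictionSketch(nn.Module):
    def __init__(self, d_model=128, num_queries=64):
        super().__init__()
        self.backbone = nn.Sequential(                    # toy 3D feature extractor
            nn.Conv3d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(32, d_model, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=8,
            num_encoder_layers=2, num_decoder_layers=2, batch_first=True,
        )
        self.queries = nn.Embedding(num_queries, d_model) # learned node queries
        self.coord_head = nn.Linear(d_model, 3)           # normalized (x, y, z)
        self.conf_head = nn.Linear(d_model, 1)            # node-existence logit

    def forward(self, vol):                               # vol: (B, 1, D, H, W)
        feats = self.backbone(vol)                        # (B, C, d, h, w)
        tokens = feats.flatten(2).transpose(1, 2)         # (B, d*h*w, C)
        q = self.queries.weight.unsqueeze(0).expand(vol.size(0), -1, -1)
        hs = self.transformer(tokens, q)                  # (B, num_queries, C)
        return torch.sigmoid(self.coord_head(hs)), self.conf_head(hs)

xyz, logit = SetPredictionSketch()(torch.randn(1, 1, 32, 64, 64))
```
Training such a model would assign predictions to ground-truth nodes by bipartite (Hungarian) matching, DETR-style; that machinery is omitted here.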
Related papers
- ReLUs Are Sufficient for Learning Implicit Neural Representations [17.786058035763254]
We revisit the use of ReLU activation functions for learning implicit neural representations (INRs).
Inspired by second-order B-spline wavelets, we incorporate a set of simple constraints on the ReLU neurons in each layer of a deep neural network (DNN).
We demonstrate that, contrary to popular belief, one can learn state-of-the-art INRs based on a DNN composed of only ReLU neurons.
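A minimal sketch of the INR setup with plain ReLU layers follows; the paper's B-spline-inspired constraints on the neurons are omitted, and the toy signal and layer sizes are illustrative.
```python
# Minimal implicit neural representation with plain ReLU layers: an MLP maps
# 2D coordinates to signal values and is fit by regression. The paper's
# B-spline-inspired constraints are omitted; toy signal only.
import torch
import torch.nn as nn

inr = nn.Sequential(nn.Linear(2, 256), nn.ReLU(),
                    nn.Linear(256, 256), nn.ReLU(),
                    nn.Linear(256, 1))

coords = torch.rand(4096, 2) * 2 - 1                        # points in [-1, 1]^2
target = torch.sin(3.14159 * coords).sum(-1, keepdim=True)  # toy "image"
opt = torch.optim.Adam(inr.parameters(), lr=1e-3)
for _ in range(200):                                        # fit the field
    opt.zero_grad()
    loss = ((inr(coords) - target) ** 2).mean()
    loss.backward()
    opt.step()
```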
arXiv Detail & Related papers (2024-06-04T17:51:08Z)
- Boosting 3D Neuron Segmentation with 2D Vision Transformer Pre-trained on Natural Images [10.790999324557179]
We propose a novel training paradigm that leverages a 2D Vision Transformer model pre-trained on large-scale natural images.
Our method builds a knowledge-sharing bridge between the abundant natural-image domain and the scarce neuron-image domain to improve 3D neuron segmentation.
Evaluated on a popular benchmark, BigNeuron, our method enhances neuron segmentation performance by 8.71% over the model trained from scratch.
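The summary does not specify the transfer mechanism; one common way to reuse 2D pre-trained weights in a volumetric model is to "inflate" the patch-embedding kernels along depth, sketched below purely as an assumption.
```python
# Hypothetical 2D-to-3D weight "inflation" for the patch embedding, a common
# way to reuse 2D pre-trained ViT weights in a volumetric model; whether the
# paper transfers weights exactly this way is an assumption.
import torch
import torch.nn as nn

conv2d = nn.Conv2d(3, 768, kernel_size=16, stride=16)  # stands in for pre-trained
conv3d = nn.Conv3d(1, 768, kernel_size=(4, 16, 16), stride=(4, 16, 16))

with torch.no_grad():
    w = conv2d.weight.mean(dim=1, keepdim=True)        # RGB -> single channel
    w = w.unsqueeze(2).repeat(1, 1, 4, 1, 1) / 4.0     # tile along depth, rescale
    conv3d.weight.copy_(w)                             # (768, 1, 4, 16, 16)
    conv3d.bias.copy_(conv2d.bias)
```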
arXiv Detail & Related papers (2024-05-04T14:57:28Z)
- On The Expressivity of Recurrent Neural Cascades [48.87943990557107]
Recurrent Neural Cascades (RNCs) are recurrent neural networks with no cyclic dependencies among recurrent neurons.
We show that RNCs can achieve the expressivity of all regular languages by introducing neurons that can implement groups.
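A toy forward step that captures the structural definition (illustrative only): each recurrent unit feeds back to itself and may read earlier units, but never later ones, so the recurrent connectivity is acyclic.
```python
# Toy recurrent neural cascade step: unit i depends on the input, its own
# previous value, and units j < i only, giving an acyclic recurrent graph.
import torch

def rnc_step(x, h, w_in, w_self, w_prev):
    new_h = torch.empty_like(h)
    for i in range(h.numel()):
        prev = (w_prev[i, :i] * new_h[:i]).sum() if i > 0 else 0.0
        new_h[i] = torch.tanh(w_in[i] * x + w_self[i] * h[i] + prev)
    return new_h

h = torch.zeros(4)
w_in, w_self = torch.randn(4), torch.randn(4)
w_prev = torch.tril(torch.randn(4, 4), diagonal=-1)  # strictly lower-triangular
for x in torch.randn(10):                            # run over an input sequence
    h = rnc_step(x, h, w_in, w_self, w_prev)
```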
arXiv Detail & Related papers (2023-12-14T15:47:26Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
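A loose sketch of the leaky-memory idea: several internal traces with learned decay rates integrate the input, and a small MLP reads the traces out. Names and sizes are illustrative, not the paper's exact model.
```python
# Loose sketch of a leaky-memory neuron (not the paper's exact ELM): several
# internal traces with learned decay rates integrate the input over time,
# and a small MLP reads the traces out into a scalar response.
import torch
import torch.nn as nn

class LeakyMemoryNeuron(nn.Module):
    def __init__(self, n_in=32, n_mem=16):
        super().__init__()
        self.decay = nn.Parameter(torch.randn(n_mem))  # per-trace leak (pre-sigmoid)
        self.proj = nn.Linear(n_in, n_mem)
        self.readout = nn.Sequential(nn.Linear(n_mem, 32), nn.ReLU(),
                                     nn.Linear(32, 1))

    def forward(self, x):                  # x: (T, n_in) input time series
        m = torch.zeros(self.decay.numel())
        lam = torch.sigmoid(self.decay)    # decay rates in (0, 1)
        outs = []
        for x_t in x:
            m = lam * m + (1 - lam) * torch.tanh(self.proj(x_t))
            outs.append(self.readout(m))
        return torch.stack(outs)           # (T, 1) response over time

y = LeakyMemoryNeuron()(torch.randn(100, 32))
```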
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Convolutional Neural Generative Coding: Scaling Predictive Coding to Natural Images [79.07468367923619]
We develop convolutional neural generative coding (Conv-NGC).
We implement a flexible neurobiologically-motivated algorithm that progressively refines latent state maps.
We study the effectiveness of our brain-inspired neural system on the tasks of reconstruction and image denoising.
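A toy version of the refinement loop follows: a latent state map is iteratively nudged to reduce the error between its top-down prediction and the observed image. Purely illustrative; Conv-NGC's actual update rules are richer.
```python
# Toy predictive-coding settling loop (illustrative): freeze a top-down
# decoder and iteratively refine a latent state map so its prediction of the
# observed image improves.
import torch
import torch.nn as nn

decode = nn.Conv2d(8, 1, 3, padding=1)             # latent map -> predicted image
img = torch.randn(1, 1, 32, 32)                    # observation
z = torch.zeros(1, 8, 32, 32, requires_grad=True)  # latent state map

opt = torch.optim.SGD([z], lr=0.1)                 # only z is updated
for _ in range(20):
    opt.zero_grad()
    err = ((decode(z) - img) ** 2).mean()          # bottom-up prediction error
    err.backward()
    opt.step()                                     # progressively refine z
```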
arXiv Detail & Related papers (2022-11-22T06:42:41Z)
- A Long Short-term Memory Based Recurrent Neural Network for Interventional MRI Reconstruction [50.1787181309337]
We propose a convolutional long short-term memory (Conv-LSTM) based recurrent neural network (RNN), or ConvLR, to reconstruct interventional images with golden-angle radial sampling.
The proposed algorithm has the potential to achieve real-time i-MRI for DBS and can be used for general-purpose MR-guided interventions.
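PyTorch ships no built-in ConvLSTM, so a minimal cell is sketched below; chaining such cells over acquisition time is the general pattern a reconstruction RNN like ConvLR would follow. Sizes are illustrative.
```python
# Minimal ConvLSTM cell (illustrative sizes): convolutional gates replace the
# dense gates of a standard LSTM, so the hidden state stays a feature map.
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    def __init__(self, c_in, c_hidden):
        super().__init__()
        self.gates = nn.Conv2d(c_in + c_hidden, 4 * c_hidden, 3, padding=1)

    def forward(self, x, h, c):
        i, f, o, g = self.gates(torch.cat([x, h], dim=1)).chunk(4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

cell = ConvLSTMCell(2, 16)                 # e.g. real/imag image channels
h = torch.zeros(1, 16, 64, 64)
c = torch.zeros(1, 16, 64, 64)
for frame in torch.randn(5, 1, 2, 64, 64): # frames arriving over time
    h, c = cell(frame, h, c)
```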
arXiv Detail & Related papers (2022-03-28T14:03:45Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
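A toy leaky integrate-and-fire layer, showing how membrane potential acts as the temporal memory the summary mentions; it is not the EVSNN architecture.
```python
# Toy leaky integrate-and-fire (LIF) layer: the membrane potential integrates
# incoming events (its "memory") and emits a spike on crossing threshold.
import torch

def lif_run(events, decay=0.9, threshold=1.0):
    # events: (T, N) binary inputs; returns (T, N) output spikes
    v = torch.zeros(events.shape[1])
    spikes = []
    for x_t in events:
        v = decay * v + x_t              # leaky integration over time
        s = (v >= threshold).float()     # spike where threshold is crossed
        v = v * (1 - s)                  # hard reset of spiking units
        spikes.append(s)
    return torch.stack(spikes)

out = lif_run((torch.rand(20, 8) < 0.3).float())
```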
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Recurrent networks improve neural response prediction and provide insights into underlying cortical circuits [3.340380180141713]
CNN models have proven to be state-of-the-art for predicting the responses of single neurons in early visual cortex to natural images.
We extend these models with recurrent convolutional layers, reflecting the well-known massive recurrence in the cortex.
We find that the hidden units in the recurrent circuits of the appropriate models, when trained on long-duration wide-field image presentations, exhibit similar temporal response dynamics and classical contextual modulations as observed in V1 neurons.
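A minimal sketch of adding a recurrent convolutional layer on top of a CNN core, so the predicted response unfolds over presentation time. Shapes and sizes are illustrative.
```python
# Sketch of a recurrent convolutional layer on a CNN core (illustrative
# shapes): the hidden map is updated over presentation time, yielding a
# time course of predicted responses rather than a single value.
import torch
import torch.nn as nn

core = nn.Sequential(nn.Conv2d(1, 16, 5, padding=2), nn.ReLU())
rec = nn.Conv2d(32, 16, 3, padding=1)       # mixes feedforward and lateral state
readout = nn.Linear(16, 1)                  # one modeled neuron

img = torch.randn(1, 1, 32, 32)             # long wide-field presentation
h = torch.zeros(1, 16, 32, 32)
rates = []
for _ in range(8):                          # unroll over time steps
    h = torch.relu(rec(torch.cat([core(img), h], dim=1)))
    rates.append(readout(h.mean(dim=(2, 3))))   # predicted rate at this step
```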
arXiv Detail & Related papers (2021-10-02T15:46:56Z)
- Voxel-wise Cross-Volume Representation Learning for 3D Neuron Reconstruction [27.836007480393953]
We propose a novel voxel-level cross-volume representation learning paradigm on the basis of an encoder-decoder segmentation model.
Our method introduces no extra cost during inference.
Evaluated on 42 3D neuron images from the BigNeuron project, our proposed method demonstrably improves the learning ability of the original segmentation model.
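The summary leaves the training objective unspecified; below is a loose, hypothetical voxel-level contrastive loss across two volumes, consistent with the "no extra cost during inference" property since it would be used only during training.
```python
# Hypothetical voxel-level objective (a guess at the flavor, not the paper's
# loss): pull foreground voxel features from two different volumes together
# and push foreground away from background within a volume.
import torch
import torch.nn.functional as F

def voxel_contrast(feat_a, feat_b, mask_a, mask_b):
    # feat_*: (C, N) voxel features; mask_*: (N,) boolean foreground masks
    fg_a = F.normalize(feat_a[:, mask_a], dim=0)
    fg_b = F.normalize(feat_b[:, mask_b], dim=0)
    bg_a = F.normalize(feat_a[:, ~mask_a], dim=0)
    pos = (fg_a.T @ fg_b).mean()   # cross-volume foreground agreement
    neg = (fg_a.T @ bg_a).mean()   # foreground/background separation
    return neg - pos               # minimizing raises pos, lowers neg

loss = voxel_contrast(torch.randn(32, 500), torch.randn(32, 500),
                      torch.rand(500) > 0.8, torch.rand(500) > 0.8)
```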
arXiv Detail & Related papers (2021-08-14T12:17:45Z)
- Over-and-Under Complete Convolutional RNN for MRI Reconstruction [57.95363471940937]
Recent deep learning-based methods for MR image reconstruction usually leverage a generic auto-encoder architecture.
We propose an Over-and-Under Complete Convolutional Recurrent Neural Network (OUCR), which consists of an overcomplete and an undercomplete Convolutional Recurrent Neural Network (CRNN).
The proposed method achieves significant improvements over compressed sensing and popular deep learning-based methods with fewer trainable parameters.
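A sketch of the over/under-complete pairing: one branch grows spatial resolution, restricting receptive fields to fine structure, while the other shrinks it to capture context, and the two are fused. Channel sizes are illustrative, and the recurrent part is omitted.
```python
# Sketch of over/under-complete branches (illustrative; recurrence omitted):
# the overcomplete branch upsamples for fine detail, the undercomplete branch
# downsamples for context, and their features are fused.
import torch
import torch.nn as nn
import torch.nn.functional as F

over = nn.Sequential(nn.Upsample(scale_factor=2),
                     nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
under = nn.Sequential(nn.MaxPool2d(2),
                      nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
fuse = nn.Conv2d(16, 1, 1)

x = torch.randn(1, 1, 64, 64)                        # undersampled MR image
hi = F.interpolate(over(x), size=x.shape[-2:])       # back to input resolution
lo = F.interpolate(under(x), size=x.shape[-2:])
recon = fuse(torch.cat([hi, lo], dim=1))             # fused reconstruction
```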
arXiv Detail & Related papers (2021-06-16T15:56:34Z)
- Factorized Neural Processes for Neural Processes: $K$-Shot Prediction of Neural Responses [9.792408261365043]
We develop a Factorized Neural Process to infer a neuron's tuning function from a small set of stimulus-response pairs.
We show on simulated responses that the predictions and reconstructed receptive fields from the Neural Process approach the ground truth as the number of trials increases.
We believe this novel deep learning systems identification framework will facilitate better real-time integration of artificial neural network modeling into neuroscience experiments.
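A minimal conditional-neural-process-style sketch of $K$-shot prediction: embed each (stimulus, response) pair, mean-pool into a permutation-invariant context vector, and condition the decoder on it. The paper's factorization is omitted; all sizes are illustrative.
```python
# Minimal conditional-neural-process-style sketch (factorization omitted):
# a K-shot context of (stimulus, response) pairs is embedded, mean-pooled
# into a context vector, and conditions the response decoder.
import torch
import torch.nn as nn

embed = nn.Sequential(nn.Linear(10 + 1, 64), nn.ReLU(), nn.Linear(64, 64))
decode = nn.Sequential(nn.Linear(64 + 10, 64), nn.ReLU(), nn.Linear(64, 1))

stim = torch.randn(5, 10)                            # K = 5 context stimuli
resp = torch.randn(5, 1)                             # observed responses
ctx = embed(torch.cat([stim, resp], dim=1)).mean(0)  # permutation-invariant

new_stim = torch.randn(3, 10)                        # query stimuli
pred = decode(torch.cat([ctx.expand(3, -1), new_stim], dim=1))
```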
arXiv Detail & Related papers (2020-10-22T15:43:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.