Lensless multicore-fiber microendoscope for real-time tailored light
field generation with phase encoder neural network (CoreNet)
- URL: http://arxiv.org/abs/2111.12758v1
- Date: Wed, 24 Nov 2021 19:37:32 GMT
- Title: Lensless multicore-fiber microendoscope for real-time tailored light
field generation with phase encoder neural network (CoreNet)
- Authors: Jiawei Sun, Jiachen Wu, Nektarios Koukourakis, Robert Kuschmierz,
Liangcai Cao and Juergen Czarske
- Abstract summary: A novel phase encoder deep neural network (CoreNet) can generate accurate tailored CGHs for MCF lensless microendoscopes at near video rate.
CoreNet reduces the computation time by two orders of magnitude and increases the fidelity of the generated light field.
This paves the way for real-time cell rotation and further applications that require real-time, high-fidelity light delivery in biomedicine.
- Score: 0.5505013339790825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The generation of tailored light with multi-core fiber (MCF) lensless
microendoscopes is widely used in biomedicine. However, the computer-generated
holograms (CGHs) used for such applications are typically produced by iterative
algorithms, which demand high computational effort and limit advanced
applications such as in vivo optogenetic stimulation and fiber-optic cell
manipulation. The random and discrete distribution of the fiber cores induces
strong spatial aliasing in the CGHs; hence, an approach that can rapidly
generate tailored CGHs for MCFs is in high demand. We demonstrate a novel
phase encoder deep neural network (CoreNet), which can generate accurate
tailored CGHs for MCFs at near video rate. Simulations show that CoreNet
reduces the computation time by two orders of magnitude and increases the
fidelity of the generated light field compared to conventional CGH techniques.
For the first time, tailored CGHs generated in real time are loaded on the fly
onto the phase-only SLM for dynamic light-field generation through the MCF
microendoscope in experiments. This paves the way for real-time cell rotation
and further applications that require real-time, high-fidelity light delivery
in biomedicine.
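The abstract describes CoreNet as a phase encoder network that maps a target light field to an MCF-compatible CGH, replacing iterative CGH optimization. The paper provides no code, so the following is only a minimal PyTorch-style sketch of that idea under stated assumptions: the single-FFT far-field propagation model, the random binary core mask, the small CNN, and all names (PhaseEncoder, propagate, core_mask) are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of a CoreNet-style phase encoder (illustrative, not the authors' code).
# Assumptions: far-field (single-FFT) propagation, a random binary mask standing in for
# the discrete MCF core positions, and an arbitrary small CNN.
import torch
import torch.nn as nn

class PhaseEncoder(nn.Module):
    """Maps a target intensity image to a phase-only CGH for the MCF facet."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, target):
        return torch.pi * torch.tanh(self.net(target))  # phase in (-pi, pi)

def propagate(phase, core_mask):
    """Toy far-field model: light exits only at the fiber cores, then a single FFT."""
    field = core_mask * torch.exp(1j * phase)
    far = torch.fft.fftshift(torch.fft.fft2(field), dim=(-2, -1))
    inten = far.abs() ** 2
    return inten / inten.amax(dim=(-2, -1), keepdim=True)

# Toy training loop on random targets (real use would supply the desired light fields).
N = 128
core_mask = (torch.rand(1, 1, N, N) < 0.02).float()  # ~2% of pixels act as fiber cores
model = PhaseEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    target = torch.rand(4, 1, N, N)
    loss = nn.functional.mse_loss(propagate(model(target), core_mask), target)
    opt.zero_grad(); loss.backward(); opt.step()
```

In the real system the differentiable propagation would model the actual MCF core layout, core-dependent phase offsets, and free-space propagation from the distal facet; the sketch only shows how a phase encoder can be trained end-to-end through such a model.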
Related papers
- Hybrid Spiking Neural Networks for Low-Power Intra-Cortical Brain-Machine Interfaces [42.72938925647165]
Intra-cortical brain-machine interfaces (iBMIs) have the potential to dramatically improve the lives of people with paraplegia.
Current iBMIs suffer from scalability and mobility limitations due to bulky hardware and wiring.
We are investigating hybrid spiking neural networks for embedded neural decoding in wireless iBMIs.
arXiv Detail & Related papers (2024-09-06T17:48:44Z)
- Prototype Learning Guided Hybrid Network for Breast Tumor Segmentation in DCE-MRI [58.809276442508256]
We propose a hybrid network that combines convolutional neural network (CNN) and transformer layers.
Experimental results on private and public DCE-MRI datasets demonstrate that the proposed hybrid network achieves superior performance compared to state-of-the-art methods.
arXiv Detail & Related papers (2024-08-11T15:46:00Z)
- Cross-Scan Mamba with Masked Training for Robust Spectral Imaging [51.557804095896174]
We propose Cross-Scanning Mamba (CS-Mamba), which employs a spatial-spectral SSM for globally and locally balanced context encoding.
Experimental results show that CS-Mamba achieves state-of-the-art performance and that the masked training method reconstructs smooth features better, improving visual quality.
arXiv Detail & Related papers (2024-08-01T15:14:10Z)
- Self-STORM: Deep Unrolled Self-Supervised Learning for Super-Resolution Microscopy [55.2480439325792]
We introduce deep unrolled self-supervised learning, which alleviates the need for ground-truth training data by training a sequence-specific, model-based autoencoder.
Our proposed method exceeds the performance of its supervised counterparts.
arXiv Detail & Related papers (2024-03-25T17:40:32Z)
- Calibration-free quantitative phase imaging in multi-core fiber endoscopes using end-to-end deep learning [49.013721992323994]
We demonstrate a learning-based MCF phase imaging method that reduces the phase reconstruction time to 5.5 ms.
We also introduce an innovative optical system that automatically generates the first open-source dataset tailored for MCF phase imaging.
Our trained deep neural network (DNN) demonstrates robust phase reconstruction performance in experiments with a mean fidelity of up to 99.8%.
arXiv Detail & Related papers (2023-12-12T09:30:12Z)
- Theoretical framework for real time sub-micron depth monitoring using quantum inline coherent imaging [55.2480439325792]
Inline Coherent Imaging (ICI) is a reliable method for real-time monitoring of various laser processes, including keyhole welding, additive manufacturing, and micromachining.
The axial resolution is limited to greater than 2 µm, making ICI unsuitable for monitoring submicron processes.
Advancements in Quantum Optical Coherence Tomography (Q-OCT) have the potential to address this issue by achieving better than 1 µm depth resolution.
arXiv Detail & Related papers (2023-09-17T17:05:21Z)
- NAF: Neural Attenuation Fields for Sparse-View CBCT Reconstruction [79.13750275141139]
This paper proposes a novel and fast self-supervised solution for sparse-view CBCT reconstruction.
The desired attenuation coefficients are represented as a continuous function of 3D spatial coordinates, parameterized by a fully-connected deep neural network.
A learning-based encoder entailing hash coding is adopted to help the network capture high-frequency details.
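As a rough illustration of the architecture sketched in this entry (a coordinate MLP for the attenuation coefficients plus a learned hash encoding), here is a simplified PyTorch sketch; the grid resolutions, table size, feature width, and the nearest-vertex lookup without interpolation are all assumptions made for brevity and do not reflect the NAF authors' configuration.

```python
# Illustrative hash-encoded coordinate network for an attenuation field, in the spirit
# of the NAF summary above. All sizes are arbitrary demonstration values.
import torch
import torch.nn as nn

class HashEncoding(nn.Module):
    """Very simplified multi-resolution hash grid: nearest-vertex lookup, no interpolation."""
    def __init__(self, n_levels=8, table_size=2**14, feat_dim=2, base_res=16):
        super().__init__()
        self.res = [base_res * 2**i for i in range(n_levels)]
        self.tables = nn.ModuleList([nn.Embedding(table_size, feat_dim) for _ in range(n_levels)])
        self.primes = torch.tensor([1, 2654435761, 805459861])  # per-axis hash multipliers

    def forward(self, xyz):                            # xyz in [0, 1]^3, shape (B, 3)
        feats = []
        for res, table in zip(self.res, self.tables):
            idx = (xyz * res).long()                   # integer grid vertex per point
            h = (idx * self.primes.to(idx.device)).sum(-1) % table.num_embeddings
            feats.append(table(h))                     # learned feature per hashed vertex
        return torch.cat(feats, dim=-1)

class AttenuationField(nn.Module):
    """Maps a 3D coordinate to a non-negative attenuation coefficient."""
    def __init__(self):
        super().__init__()
        self.enc = HashEncoding()
        self.mlp = nn.Sequential(
            nn.Linear(8 * 2, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Softplus(),
        )

    def forward(self, xyz):
        return self.mlp(self.enc(xyz))

# Usage: sample points along X-ray paths, integrate the predicted attenuation,
# and compare against the measured sparse-view projections.
mu = AttenuationField()(torch.rand(1024, 3))           # (1024, 1) attenuation samples
```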
arXiv Detail & Related papers (2022-09-29T04:06:00Z)
- LWGNet: Learned Wirtinger Gradients for Fourier Ptychographic Phase Retrieval [14.588976801396576]
We propose a hybrid model-driven residual network that combines the knowledge of the forward imaging system with a deep data-driven network.
Unlike other conventional unrolling techniques, LWGNet uses fewer stages while performing on par with or better than existing traditional and deep learning techniques.
This improvement in performance for low-bit depth and low-cost sensors has the potential to bring down the cost of FPM imaging setup significantly.
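To make the "learned Wirtinger gradients" idea concrete, here is a heavily simplified, hypothetical sketch: each unrolled stage computes the gradient of an amplitude data-fidelity term through autograd and lets a small CNN decide how to apply it. The forward model is reduced to a single Fourier-magnitude measurement rather than full Fourier ptychography, and every size and name below is an illustrative assumption, not LWGNet's design.

```python
# Toy "learned gradient" unrolling, simplified to one Fourier-magnitude measurement.
import torch
import torch.nn as nn

def data_grad(obj, meas):
    """Gradient of 0.5*||(|F(obj)| - meas)||^2 w.r.t. the complex object, via autograd."""
    obj = obj.detach().requires_grad_(True)            # detached here for a simple sketch
    loss = 0.5 * ((torch.fft.fft2(obj).abs() - meas) ** 2).sum()
    (grad,) = torch.autograd.grad(loss, obj)
    # For a real-valued loss, the negative of this complex gradient is a descent direction.
    return grad

class LearnedStep(nn.Module):
    """One unrolled stage: a tiny CNN turns the raw gradient into an update."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 2, 3, padding=1),
        )

    def forward(self, obj, grad):
        g = torch.stack([grad.real, grad.imag], dim=1)        # (B, 2, H, W)
        upd = self.net(g)
        return obj - torch.complex(upd[:, 0], upd[:, 1])      # learned descent step

# Unrolled reconstruction with K stages (toy usage; meas is a placeholder measurement).
K, H = 4, 64
stages = nn.ModuleList([LearnedStep() for _ in range(K)])
meas = torch.rand(1, H, H)                                    # measured Fourier amplitudes
obj = torch.ones(1, H, H, dtype=torch.cfloat)                 # initial object estimate
for stage in stages:
    obj = stage(obj, data_grad(obj, meas))
```

A training loop over paired objects and measurements would optimize the stage CNNs end-to-end; that part is omitted here.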
arXiv Detail & Related papers (2022-08-08T17:22:54Z)
- RF-Photonic Deep Learning Processor with Shannon-Limited Data Movement [0.0]
Optical neural networks (ONNs) are promising accelerators with ultra-low latency and energy consumption.
We introduce our multiplicative analog frequency transform ONN (MAFT-ONN) that encodes the data in the frequency domain.
We experimentally demonstrate the first hardware accelerator that computes fully-analog deep learning on raw RF signals.
arXiv Detail & Related papers (2022-07-08T16:37:13Z)
- Three-dimensional microstructure generation using generative adversarial neural networks in the context of continuum micromechanics [77.34726150561087]
This work proposes a generative adversarial network tailored towards three-dimensional microstructure generation.
The lightweight algorithm is able to learn the underlying properties of the material from a single micro-CT scan without the need for explicit descriptors.
arXiv Detail & Related papers (2022-05-31T13:26:51Z)
- Sparse deep computer-generated holography for optical microscopy [2.578242050187029]
Computer-generated holography (CGH) has broad applications such as direct-view display, virtual and augmented reality, as well as optical microscopy.
We propose a CGH algorithm using an unsupervised generative model designed for optical microscopy to synthesize 3D selected illumination.
The algorithm, named sparse deep CGH, is able to generate sparsely distributed points in a large 3D volume with higher contrast than conventional CGH algorithms.
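To illustrate what an unsupervised CGH objective of this kind can look like, here is a minimal sketch that directly optimizes a phase-only SLM pattern through band-limited angular-spectrum propagation to a few depth planes containing sparse point targets; the wavelength, pixel pitch, depths, and the use of plain Adam instead of a generative network are illustrative assumptions, not the sparse deep CGH algorithm.

```python
# Minimal unsupervised CGH sketch: optimize an SLM phase so that angular-spectrum
# propagation to two depth planes matches sparse point targets. Parameters are
# illustrative only.
import torch

N, dx, lam = 256, 8e-6, 532e-9             # grid size, pixel pitch [m], wavelength [m]
fx = torch.fft.fftfreq(N, d=dx)
FX, FY = torch.meshgrid(fx, fx, indexing="ij")
kz2 = (1.0 / lam) ** 2 - FX**2 - FY**2     # squared z spatial frequency
kz = 2 * torch.pi * torch.sqrt(torch.clamp(kz2, min=0.0))

def propagate(phase, z):
    """Angular-spectrum propagation of a unit-amplitude, phase-only field by distance z."""
    field = torch.exp(1j * phase)
    H = torch.exp(1j * kz * z) * (kz2 > 0)  # evanescent components dropped
    return torch.fft.ifft2(torch.fft.fft2(field) * H)

# Sparse targets: a handful of bright points at each of two depth planes.
targets = {1e-3: torch.zeros(N, N), 2e-3: torch.zeros(N, N)}
for z, t in targets.items():
    idx = torch.randint(0, N, (10, 2))
    t[idx[:, 0], idx[:, 1]] = 1.0

phase = (0.1 * torch.randn(N, N)).requires_grad_(True)
opt = torch.optim.Adam([phase], lr=0.1)
for _ in range(500):
    loss = 0.0
    for z, t in targets.items():
        inten = propagate(phase, z).abs() ** 2
        loss = loss + ((inten / inten.max() - t) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```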
arXiv Detail & Related papers (2021-11-30T07:34:17Z)