Understanding the Influence of Receptive Field and Network Complexity in Neural-Network-Guided TEM Image Analysis
- URL: http://arxiv.org/abs/2204.04250v1
- Date: Fri, 8 Apr 2022 18:45:15 GMT
- Title: Understanding the Influence of Receptive Field and Network Complexity in Neural-Network-Guided TEM Image Analysis
- Authors: Katherine Sytwu, Catherine Groschner, Mary C. Scott
- Abstract summary: We systematically examine how neural network architecture choices affect how neural networks segment nanoparticles from background in transmission electron microscopy (TEM) images.
We find that for low-resolution TEM images which rely on amplitude contrast to distinguish nanoparticles from background, the receptive field does not significantly influence segmentation performance.
On the other hand, for high-resolution TEM images which rely on a combination of amplitude and phase contrast changes to identify nanoparticles, receptive field is a key parameter for increased performance.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Trained neural networks are promising tools to analyze the ever-increasing
amount of scientific image data, but it is unclear how to best customize these
networks for the unique features in transmission electron micrographs. Here, we
systematically examine how neural network architecture choices affect how
neural networks segment, or pixel-wise separate, crystalline nanoparticles from
amorphous background in transmission electron microscopy (TEM) images. We focus
on decoupling the influence of receptive field, or the area of the input image
that contributes to the output decision, from network complexity, which
dictates the number of trainable parameters. We find that for low-resolution
TEM images which rely on amplitude contrast to distinguish nanoparticles from
background, the receptive field does not significantly influence segmentation
performance. On the other hand, for high-resolution TEM images which rely on a
combination of amplitude and phase contrast changes to identify nanoparticles,
receptive field is a key parameter for increased performance, especially in
images with minimal amplitude contrast. Our results provide insight and
guidance as to how to adapt neural networks for applications with TEM datasets.
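To make the decoupling concrete, the sketch below computes the theoretical receptive field and the trainable-parameter count of two small convolutional encoders. It is an illustrative sketch only, not code from the paper: the layer configurations, channel widths, and helper names are assumptions chosen to show that strided layers can enlarge the receptive field while a stack of unit-stride layers can add many parameters without enlarging it.

# Minimal sketch (not the paper's code): compare the theoretical receptive
# field and trainable-parameter count of two small convolutional encoders.
# The layer configurations below are illustrative assumptions.

def receptive_field(layers):
    """layers: list of (kernel_size, stride); returns the receptive field in input pixels."""
    rf, jump = 1, 1                      # start from a single output pixel
    for k, s in layers:
        rf += (k - 1) * jump             # each layer widens the view by (k - 1) * jump
        jump *= s                        # stride multiplies the sampling step
    return rf

def conv_params(channels, layers):
    """channels: [c_in, c1, c2, ...]; layers: list of (kernel_size, stride)."""
    total = 0
    for (k, _), c_in, c_out in zip(layers, channels, channels[1:]):
        total += k * k * c_in * c_out + c_out    # weights plus biases
    return total

# Variant A: unit-stride 3x3 convolutions with wide channels (high complexity, small receptive field).
layers_a = [(3, 1)] * 4
channels_a = [1, 64, 64, 128, 128]

# Variant B: strided 3x3 convolutions with narrow channels (low complexity, large receptive field).
layers_b = [(3, 2)] * 4
channels_b = [1, 16, 16, 32, 32]

for name, layers, channels in (("A", layers_a, channels_a), ("B", layers_b, channels_b)):
    print(name, "receptive field:", receptive_field(layers),
          "parameters:", conv_params(channels, layers))

Running the sketch gives variant A a receptive field of 9 pixels with about 259k parameters and variant B a receptive field of 31 pixels with about 16k parameters, which is the kind of separation between receptive field and network complexity that the abstract refers to.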
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Learning Multimodal Volumetric Features for Large-Scale Neuron Tracing [72.45257414889478]
We aim to reduce human workload by predicting connectivity between over-segmented neuron pieces.
We first construct a dataset, named FlyTracing, that contains millions of pairwise connections of segments spanning the whole fly brain.
We propose a novel connectivity-aware contrastive learning method to generate dense volumetric EM image embedding.
arXiv Detail & Related papers (2024-01-05T19:45:12Z)
- Unleashing the Power of Depth and Pose Estimation Neural Networks by Designing Compatible Endoscopic Images [12.412060445862842]
We conduct a detailed analysis of the properties of endoscopic images and improve the compatibility between the images and neural networks.
First, we introduce the Mask Image Modelling (MIM) module, which inputs partial image information instead of complete image information.
Second, we propose a lightweight neural network to enhance the endoscopic images, to explicitly improve the compatibility between images and neural networks.
arXiv Detail & Related papers (2023-09-14T02:19:38Z)
- Generalization Across Experimental Parameters in Machine Learning Analysis of High Resolution Transmission Electron Microscopy Datasets [0.0]
We train and validate neural networks across curated, experimentally-collected high-resolution TEM image datasets of nanoparticles.
We find that our neural networks are not robust across microscope parameters, but do generalize across certain sample parameters.
arXiv Detail & Related papers (2023-06-20T19:13:49Z)
- Impact of Scaled Image on Robustness of Deep Neural Networks [0.0]
Scaling the raw images creates out-of-distribution data, which can act as an adversarial attack that fools the networks.
In this work, we propose a scaling-distortion dataset, ImageNet-CS, built by scaling a subset of the ImageNet Challenge dataset by different multiples (a minimal rescaling sketch appears after this list).
arXiv Detail & Related papers (2022-09-02T08:06:58Z)
- All-optical graph representation learning using integrated diffractive photonic computing units [51.15389025760809]
Photonic neural networks perform brain-inspired computations using photons instead of electrons.
We propose an all-optical graph representation learning architecture, termed the diffractive graph neural network (DGNN).
We demonstrate the use of DGNN extracted features for node and graph-level classification tasks with benchmark databases and achieve superior performance.
arXiv Detail & Related papers (2022-04-23T02:29:48Z)
- FuNNscope: Visual microscope for interactively exploring the loss landscape of fully connected neural networks [77.34726150561087]
We show how to explore high-dimensional landscape characteristics of neural networks.
We generalize observations on small neural networks to more complex systems.
An interactive dashboard opens up a number of possible applications.
arXiv Detail & Related papers (2022-04-09T16:41:53Z)
- The Spectral Bias of Polynomial Neural Networks [63.27903166253743]
Polynomial neural networks (PNNs) have been shown to be particularly effective at image generation and face recognition, where high-frequency information is critical.
Previous studies have revealed that neural networks demonstrate a $\textit{spectral bias}$ towards low-frequency functions, which yields faster learning of low-frequency components during training.
Inspired by such studies, we conduct a spectral analysis of the Neural Tangent Kernel (NTK) of PNNs.
We find that the $\Pi$-Net family, i.e., a recently proposed parametrization of PNNs, speeds up the learning of higher frequencies.
arXiv Detail & Related papers (2022-02-27T23:12:43Z)
- Convolutional Neural Network with Convolutional Block Attention Module for Finger Vein Recognition [4.035753155957698]
We propose a lightweight convolutional neural network with a convolutional block attention module (CBAM) for finger vein recognition.
The experiments are carried out on two publicly available databases and the results demonstrate that the proposed method achieves a stable, highly accurate, and robust performance in multimodal finger recognition.
arXiv Detail & Related papers (2022-02-14T12:59:23Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Correlation between image quality metrics of magnetic resonance images and the neural network segmentation accuracy [0.0]
In this study, we investigated the correlation between the image quality metrics of MR images and the neural network segmentation accuracy.
The difference in segmentation accuracy between models trained on randomly selected inputs and models trained on IQM-based inputs sheds light on the role of image quality metrics in segmentation accuracy (a generic correlation sketch follows this list).
arXiv Detail & Related papers (2021-11-01T17:02:34Z)
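As referenced in the entry above for Impact of Scaled Image on Robustness of Deep Neural Networks, building a scaling-distortion set amounts to rescaling each image by several multiples and saving the variants. The sketch below is a hypothetical illustration only; the scale factors, directory layout, and file pattern are assumptions, not the ImageNet-CS recipe.

# Hypothetical sketch: rescale each image by several multiples and save the
# variants. Scale factors, paths, and the resampling default are assumptions.
from pathlib import Path
from PIL import Image

SRC = Path("imagenet_subset")         # assumed directory of source images
DST = Path("imagenet_scaled")         # assumed output directory
SCALES = [0.5, 0.75, 1.25, 1.5, 2.0]  # assumed scaling multiples

for img_path in SRC.glob("*.JPEG"):
    img = Image.open(img_path).convert("RGB")
    w, h = img.size
    for s in SCALES:
        scaled = img.resize((int(w * s), int(h * s)))   # Pillow's default resampling
        out_dir = DST / f"scale_{s}"
        out_dir.mkdir(parents=True, exist_ok=True)
        scaled.save(out_dir / img_path.name)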
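As referenced in the entry above on image quality metrics of MR images, correlating an image quality metric with segmentation accuracy comes down to a per-image metric, a per-image accuracy score, and a rank correlation. The sketch below uses a generic gradient-magnitude sharpness proxy, a Dice score, random stand-in arrays, and a Spearman correlation; none of these are the specific metrics or data from the cited study.

# Hypothetical sketch: correlate a simple image-quality proxy with segmentation
# accuracy. The sharpness proxy and the random stand-in data are assumptions.
import numpy as np
from scipy.stats import spearmanr

def sharpness(image):
    """Mean gradient magnitude as a crude image-quality proxy."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def dice(pred, target, eps=1e-8):
    """Dice coefficient between two binary masks."""
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

rng = np.random.default_rng(0)
images = [rng.random((64, 64)) for _ in range(20)]            # stand-in "images"
predictions = [rng.random((64, 64)) > 0.5 for _ in range(20)]
ground_truth = [rng.random((64, 64)) > 0.5 for _ in range(20)]

iqm_values = [sharpness(im) for im in images]
accuracies = [dice(p, t) for p, t in zip(predictions, ground_truth)]
rho, pval = spearmanr(iqm_values, accuracies)
print(f"Spearman rho = {rho:.3f}, p = {pval:.3f}")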