Imaging Conductivity from Current Density Magnitude using Neural
Networks
- URL: http://arxiv.org/abs/2204.02441v1
- Date: Tue, 5 Apr 2022 18:31:03 GMT
- Title: Imaging Conductivity from Current Density Magnitude using Neural
Networks
- Authors: Bangti Jin and Xiyao Li and Xiliang Lu
- Abstract summary: We develop a neural network based reconstruction technique for imaging the conductivity from the magnitude of the internal current density.
The approach is observed to be remarkably robust to data noise.
- Score: 1.8692254863855962
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Conductivity imaging represents one of the most important tasks in medical
imaging. In this work we develop a neural network based reconstruction
technique for imaging the conductivity from the magnitude of the internal
current density. It is achieved by formulating the problem as a relaxed
weighted least-gradient problem, and then approximating its minimizer by
standard fully connected feedforward neural networks. We derive bounds on the
two components of the generalization error, i.e., the approximation error and
the statistical error, explicitly in terms of properties of the neural network
(e.g., depth, total number of parameters, and a bound on the network
parameters). We illustrate the performance and distinct features of the
approach in several numerical experiments. Numerically, the approach is
observed to be remarkably robust to the presence of data noise.
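To make the abstract's formulation concrete: in the standard weighted least-gradient approach to this problem, the interior data a = |sigma grad u| serves as a weight, the voltage potential u minimizes F(v) = \int_\Omega a |\nabla v| dx over candidates v with prescribed boundary data g, and the conductivity is then recovered as sigma = a / |grad u|. On our reading, the "relaxed" version replaces the hard boundary condition with a penalty. Below is a minimal PyTorch sketch of such a relaxed loss; the network sizes, penalty weight, and sampling scheme are our own illustrative choices, not the authors' settings.

```python
# Hedged sketch: minimize a relaxed weighted least-gradient functional
#   L(v) = mean_i a(x_i) |grad v(x_i)| + lam * mean_j (v(y_j) - g(y_j))^2
# over an MLP v_theta, where a = |sigma grad u| is the measured current
# density magnitude. All names and hyperparameters are illustrative.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, width=32, depth=4):
        super().__init__()
        layers, d_in = [], 2                      # 2-D domain assumed
        for _ in range(depth):
            layers += [nn.Linear(d_in, width), nn.Tanh()]
            d_in = width
        layers.append(nn.Linear(width, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

def relaxed_least_gradient_loss(v, x_int, a_int, x_bdy, g_bdy, lam=100.0):
    x_int = x_int.requires_grad_(True)
    u = v(x_int)
    grad_u = torch.autograd.grad(u.sum(), x_int, create_graph=True)[0]
    interior = (a_int * grad_u.norm(dim=1)).mean()           # weighted TV term
    boundary = ((v(x_bdy).squeeze(-1) - g_bdy) ** 2).mean()  # relaxed Dirichlet condition
    return interior + lam * boundary

# After training v_theta with any optimizer on this loss, the conductivity
# is recovered pointwise as sigma = a / |grad v_theta| at interior points.
```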
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
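A minimal sketch of the idea in the entry above, using our own construction rather than the paper's exact scheme: an MLP is encoded as a graph whose nodes are neurons, whose edge features are the weights, and whose node features are the biases, so that a graph network can process parameter sets of varying architectures.

```python
# Hedged sketch: encode an MLP's parameters as a computational graph.
# Neurons -> nodes, weights -> edge features, biases -> node features.
import numpy as np

def mlp_to_graph(weights, biases):
    """weights: list of (n_out, n_in) arrays; biases: list of (n_out,) arrays."""
    sizes = [weights[0].shape[1]] + [w.shape[0] for w in weights]
    offsets = np.cumsum([0] + sizes)              # global node index per layer
    edges, edge_feats = [], []
    node_feats = np.zeros(offsets[-1])
    for l, (w, b) in enumerate(zip(weights, biases)):
        node_feats[offsets[l + 1]:offsets[l + 2]] = b     # bias as node feature
        for i in range(w.shape[0]):               # target neuron in layer l+1
            for j in range(w.shape[1]):           # source neuron in layer l
                edges.append((offsets[l] + j, offsets[l + 1] + i))
                edge_feats.append(w[i, j])        # weight as edge feature
    return np.array(edges), np.array(edge_feats), node_feats

rng = np.random.default_rng(0)
W = [rng.normal(size=(3, 2)), rng.normal(size=(1, 3))]    # tiny 2-3-1 MLP
b = [np.zeros(3), np.zeros(1)]
edges, ef, nf = mlp_to_graph(W, b)
print(edges.shape, ef.shape, nf.shape)            # (9, 2) (9,) (6,)
```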
- Addressing caveats of neural persistence with deep graph persistence [54.424983583720675]
We find that the variance of network weights and spatial concentration of large weights are the main factors that impact neural persistence.
We propose an extension of the filtration underlying neural persistence to the whole neural network instead of single layers.
This yields our deep graph persistence measure, which implicitly incorporates persistent paths through the network and alleviates variance-related issues.
arXiv Detail & Related papers (2023-07-20T13:34:11Z)
- Measure theoretic results for approximation by neural networks with limited weights [0.0]
We study approximation properties of single hidden layer neural networks with weights varying on finitely many directions and thresholds from an open interval.
We obtain a necessary and at the same time sufficient measure theoretic condition for density of such networks in the space of continuous functions.
arXiv Detail & Related papers (2023-04-04T15:34:53Z)
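A worked form of the network class in the entry above, under our reading of the summary: single hidden layer networks whose hidden weights range over finitely many fixed directions w_1, ..., w_k and whose thresholds lie in an open interval (a, b).

```latex
% Hedged sketch of the network class described above; the notation is
% ours, not necessarily the paper's.
\[
  \mathcal{N} = \Big\{ x \mapsto \sum_{i=1}^{k} \sum_{j=1}^{m_i}
      c_{ij}\, \sigma\big( w_i \cdot x - \theta_{ij} \big)
      \;:\; c_{ij} \in \mathbb{R},\ \theta_{ij} \in (a, b),\ m_i \in \mathbb{N} \Big\}.
\]
```

The paper's result, per the summary, is a measure-theoretic condition on such networks that is simultaneously necessary and sufficient for density of this set in the space of continuous functions.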
- Conductivity Imaging from Internal Measurements with Mixed Least-Squares Deep Neural Networks [4.228167013618626]
We develop a novel approach using deep neural networks to reconstruct the conductivity distribution in elliptic problems.
We provide a thorough analysis of the deep neural network approximations of the conductivity for both continuous and empirical losses.
arXiv Detail & Related papers (2023-03-29T04:43:03Z)
- Globally Optimal Training of Neural Networks with Threshold Activation Functions [63.03759813952481]
We study weight decay regularized training problems of deep neural networks with threshold activations.
We derive a simplified convex optimization formulation when the dataset can be shattered at a certain layer of the network.
arXiv Detail & Related papers (2023-03-06T18:59:13Z)
- Anomaly Detection in Image Datasets Using Convolutional Neural Networks, Center Loss, and Mahalanobis Distance [0.0]
User activities generate a significant number of poor-quality or irrelevant images and data vectors.
For neural networks, anomalies are usually defined as out-of-distribution samples.
This work proposes methods for supervised and semi-supervised detection of out-of-distribution samples in image datasets.
arXiv Detail & Related papers (2021-04-13T13:44:03Z)
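A minimal sketch of the Mahalanobis-distance side of such a detector, assuming the common recipe of fitting class-conditional Gaussians with a shared covariance to in-distribution features; the full pipeline (including center-loss training of the feature extractor) belongs to the paper, and all shapes and constants here are illustrative.

```python
# Hedged sketch: Mahalanobis-distance out-of-distribution scoring on
# penultimate-layer features. Large minimum distance => likely OOD.
import numpy as np

def fit_gaussian(feats, labels, n_classes):
    means = np.stack([feats[labels == c].mean(axis=0) for c in range(n_classes)])
    centered = feats - means[labels]
    cov = centered.T @ centered / len(feats)            # shared covariance
    prec = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))  # ridge for stability
    return means, prec

def mahalanobis_score(x, means, prec):
    d = x[None, :] - means                              # (n_classes, dim)
    return np.min(np.einsum('cd,de,ce->c', d, prec, d))  # min quadratic form

rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 8))
labels = rng.integers(0, 4, size=200)
means, prec = fit_gaussian(feats, labels, n_classes=4)
print(mahalanobis_score(rng.normal(size=8), means, prec))
```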
- Topological obstructions in neural networks learning [67.8848058842671]
We study global properties of the gradient flow of the loss function.
We use topological data analysis of the loss function and its Morse complex to relate local behavior along gradient trajectories with global properties of the loss surface.
arXiv Detail & Related papers (2020-12-31T18:53:25Z)
- The Representation Power of Neural Networks: Breaking the Curse of Dimensionality [0.0]
We prove upper bounds on the number of parameters needed by shallow and deep neural networks to approximate Korobov functions.
We further prove that these bounds nearly match the minimal number of parameters any continuous function approximator needs to approximate Korobov functions.
arXiv Detail & Related papers (2020-12-10T04:44:07Z)
- Compressive sensing with un-trained neural networks: Gradient descent finds the smoothest approximation [60.80172153614544]
Un-trained convolutional neural networks have emerged as highly successful tools for image recovery and restoration.
We show that an un-trained convolutional neural network can approximately reconstruct signals and images that are sufficiently structured, from a near minimal number of random measurements.
arXiv Detail & Related papers (2020-05-07T15:57:25Z)
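The result in the entry above can be read as deep-image-prior-style recovery: fix random measurements y = A x*, parametrize the image as the output of an un-trained convolutional generator G(w), and run gradient descent on ||A G(w) - y||^2. A toy sketch; the tiny network, signal, and step count are our own illustrative choices.

```python
# Hedged sketch: compressive sensing with an un-trained ConvNet.
import torch
import torch.nn as nn

n, m = 16 * 16, 80                                   # pixels, measurements
A = torch.randn(m, n) / m ** 0.5                     # random measurement matrix
x_true = torch.zeros(1, 1, 16, 16)
x_true[0, 0, 4:12, 4:12] = 1.0                       # a structured signal
y = A @ x_true.flatten()                             # observed measurements

G = nn.Sequential(                                   # small un-trained generator
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
z = torch.randn(1, 1, 16, 16)                        # fixed random input
opt = torch.optim.Adam(G.parameters(), lr=1e-3)
for step in range(2000):                             # fit measurements only
    opt.zero_grad()
    loss = ((A @ G(z).flatten() - y) ** 2).sum()
    loss.backward()
    opt.step()
print(float(((G(z) - x_true) ** 2).mean()))          # reconstruction error
```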
- Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks [107.77595511218429]
In this paper, we investigate the empirical Rademacher complexity related to intermediate layers of deep neural networks.
We propose a feature distortion method (Disout) for addressing the aforementioned problem.
The proposed feature map distortion is analyzed and demonstrated to produce deep neural networks with higher test performance.
arXiv Detail & Related papers (2020-02-23T13:59:13Z)
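As a rough illustration of the idea of distorting feature maps rather than zeroing them as dropout does, here is a simplified stand-in layer; it is not the authors' exact Disout operator (whose perturbation is tied to their Rademacher-complexity analysis), and the rate and magnitude parameters are illustrative.

```python
# Hedged sketch: a generic feature-map distortion layer. During training,
# randomly selected activations are perturbed with noise instead of zeroed.
import torch
import torch.nn as nn

class FeatureDistortion(nn.Module):
    def __init__(self, p=0.1, alpha=1.0):
        super().__init__()
        self.p, self.alpha = p, alpha               # distortion rate, magnitude

    def forward(self, x):
        if not self.training:
            return x                                # identity at test time
        mask = (torch.rand_like(x) < self.p).float()
        noise = self.alpha * x.std() * torch.randn_like(x)
        return x + mask * noise                     # distort selected entries

layer = FeatureDistortion(p=0.1)
layer.train()
print(layer(torch.randn(2, 4, 8, 8)).shape)         # torch.Size([2, 4, 8, 8])
```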
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.