Sinusoidal Sensitivity Calculation for Line Segment Geometries
- URL: http://arxiv.org/abs/2208.03059v1
- Date: Fri, 5 Aug 2022 09:30:55 GMT
- Title: Sinusoidal Sensitivity Calculation for Line Segment Geometries
- Authors: Luciano Vinas and Atchar Sudyadhom
- Abstract summary: This paper presents a closed-form solution to the sinusoidal coil sensitivity model proposed by Kern et al.
It allows for precise computations of varied, simulated bias fields for ground-truth debias datasets.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Purpose: Provide a closed-form solution to the sinusoidal coil sensitivity
model proposed by Kern et al. This closed-form allows for precise computations
of varied, simulated bias fields for ground-truth debias datasets.
Methods: Fourier distribution theory and standard integration techniques were
used to calculate the Fourier transform for line segment magnetic fields.
Results: An $L^1_{\rm loc}(\mathbb{R}^3)$ function is derived in full
generality for arbitrary line segment geometries. Sampling criteria and
equivalence to the original sinusoidal model are also discussed. Lastly, a
CUDA-accelerated implementation, $\texttt{biasgen}$, is provided by the authors.
Conclusion: As the derived result is influenced by coil positioning and
geometry, practitioners will have access to a more diverse ecosystem of
simulated datasets which may be used to compare prospective debiasing methods.
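The flavor of a sinusoidal bias-field simulation can be sketched as below. This is a minimal illustration only: the separable sinusoid, the frequency and phase parameters, and the output range are assumptions for demonstration, not the paper's derived closed form or the $\texttt{biasgen}$ API.

```python
import numpy as np

# Illustrative sketch: a smooth, separable low-frequency sinusoidal bias
# field on a 3D grid, in the spirit of sinusoidal coil-sensitivity models.
# Frequencies, phases, and the [0.5, 1.5] output range are assumptions.
def sinusoidal_bias(shape, freq=(1.0, 1.0, 1.0), phase=(0.0, 0.0, 0.0)):
    """Return a positive multiplicative bias field on a grid over [0, 1]^3."""
    axes = [np.linspace(0.0, 1.0, n) for n in shape]
    zz, yy, xx = np.meshgrid(*axes, indexing="ij")
    field = (np.sin(np.pi * freq[0] * zz + phase[0])
             * np.sin(np.pi * freq[1] * yy + phase[1])
             * np.sin(np.pi * freq[2] * xx + phase[2]))
    # Rescale to the positive multiplicative range [0.5, 1.5].
    field = 0.5 + (field - field.min()) / (field.max() - field.min())
    return field

bias = sinusoidal_bias((32, 32, 32), freq=(1.0, 2.0, 1.5))
biased_image = np.ones((32, 32, 32)) * bias  # apply to a ground-truth image
```

Multiplying a ground-truth volume by such a field yields a (ground-truth, biased) pair of the kind debiasing methods are evaluated on.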
Related papers
- Extracting Manifold Information from Point Clouds [0.0]
A kernel-based method is proposed for the construction of signature functions of subsets of $\mathbb{R}^d$.
The analysis of point clouds is the main application.
arXiv Detail & Related papers (2024-03-30T17:21:07Z)
- Sketching the Heat Kernel: Using Gaussian Processes to Embed Data [4.220336689294244]
We introduce a novel, non-deterministic method for embedding data in low-dimensional Euclidean space based on realizations of a Gaussian process depending on the geometry of the data.
Our method demonstrates further advantage in its robustness to outliers.
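A minimal sketch of this style of embedding, under illustrative assumptions (the Gaussian affinity as a stand-in for the heat kernel, the diffusion time, and the embedding dimension are all choices made here, not the paper's):

```python
import numpy as np

# Hedged sketch: embed points by sampling Gaussian-process realizations
# whose covariance is a heat-kernel-like Gaussian affinity on the data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                   # toy point cloud in R^3

d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
t = 0.5                                        # diffusion time (assumed)
K = np.exp(-d2 / (4.0 * t))                    # heat-kernel-style covariance
K += 1e-6 * np.eye(len(X))                     # jitter for stability

L_chol = np.linalg.cholesky(K)
k = 5                                          # embedding dimension (assumed)
Z = L_chol @ rng.normal(size=(len(X), k))      # k GP draws -> coordinates
# Each row of Z is a k-dimensional Euclidean embedding of one point.
```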
arXiv Detail & Related papers (2024-03-01T22:56:19Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z)
- Test Set Sizing Via Random Matrix Theory [91.3755431537592]
This paper uses techniques from Random Matrix Theory to find the ideal training-testing data split for a simple linear regression.
It defines "ideal" as satisfying the integrity metric, i.e., the empirical model error equals the actual measurement noise.
This paper is the first to solve for the training and test size for any model in a way that is truly optimal.
arXiv Detail & Related papers (2021-12-11T13:18:33Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Machine Learning and Variational Algorithms for Lattice Field Theory [1.198562319289569]
In lattice quantum field theory studies, parameters defining the lattice theory must be tuned toward criticality to access continuum physics.
We introduce an approach to "deform" Monte Carlo estimators based on contour deformations applied to the domain of the path integral.
We demonstrate that flow-based MCMC can mitigate critical slowing down and observifolds can exponentially reduce variance in proof-of-principle applications.
arXiv Detail & Related papers (2021-06-03T16:37:05Z)
- Graph Based Gaussian Processes on Restricted Domains [13.416168979487118]
In nonparametric regression, it is common for the inputs to fall in a restricted subset of Euclidean space.
We propose a new class of Graph Laplacian based GPs (GL-GPs) which learn a covariance that respects the geometry of the input domain.
We provide substantial theoretical support for the GL-GP methodology, and illustrate performance gains in various applications.
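A toy sketch of a graph-Laplacian-driven GP covariance follows. The k-NN graph construction, the heat-kernel form $\exp(-tL)$, and the diffusion time are illustrative assumptions, not the GL-GP construction of the paper.

```python
import numpy as np

# Hedged sketch: GP regression whose covariance is a graph heat kernel
# exp(-t L) built from a small k-nearest-neighbour graph on the inputs.
rng = np.random.default_rng(1)
X = rng.uniform(size=(40, 2))                 # points in a restricted domain

# k-nearest-neighbour adjacency, symmetrised.
d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
k = 5
A = np.zeros_like(d2)
for i in range(len(X)):
    nn = np.argsort(d2[i])[1:k + 1]           # skip self at index 0
    A[i, nn] = 1.0
A = np.maximum(A, A.T)

L = np.diag(A.sum(1)) - A                     # combinatorial graph Laplacian
w, V = np.linalg.eigh(L)
t = 0.3                                       # diffusion time (assumed)
K = (V * np.exp(-t * w)) @ V.T                # heat-kernel covariance exp(-tL)

# GP regression on the graph: observe some nodes, predict at all nodes.
obs = np.arange(0, 40, 4)
y = np.sin(4 * X[obs, 0])
noise = 1e-4
alpha = np.linalg.solve(K[np.ix_(obs, obs)] + noise * np.eye(len(obs)), y)
f_pred = K[:, obs] @ alpha                    # posterior mean at all nodes
```

Because the covariance diffuses along graph edges rather than through ambient Euclidean space, predictions respect the geometry encoded by the graph.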
arXiv Detail & Related papers (2020-10-14T17:01:29Z)
- Linear-time inference for Gaussian Processes on one dimension [17.77516394591124]
We investigate data sampled on one dimension for which state-space models are popular due to their linearly-scaling computational costs.
We provide the first general proof of the conjecture that state-space models can approximate any one-dimensional Gaussian process.
We develop parallelized algorithms for performing inference and learning in the LEG model, test the algorithm on real and synthetic data, and demonstrate scaling to datasets with billions of samples.
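The linear-scaling idea can be sketched with a Kalman filter for a simple Ornstein-Uhlenbeck (Matern-1/2) kernel; the kernel choice and hyperparameters here are assumptions for illustration, and the paper's LEG model is considerably more general.

```python
import numpy as np

# Hedged sketch: O(n) GP log-likelihood on 1-D inputs via a state-space
# (Kalman filter) representation of an Ornstein-Uhlenbeck kernel.
def ou_kalman_loglik(ts, ys, ell=1.0, var=1.0, noise=0.1):
    """Log marginal likelihood of an OU-kernel GP, computed in one pass."""
    loglik, m, P = 0.0, 0.0, var
    prev_t = ts[0]
    for t, y in zip(ts, ys):
        # Predict: exact OU transition across the gap since the last input.
        a = np.exp(-(t - prev_t) / ell)
        m, P = a * m, a * a * P + var * (1.0 - a * a)
        # Update with the noisy observation y.
        S = P + noise
        loglik += -0.5 * (np.log(2 * np.pi * S) + (y - m) ** 2 / S)
        gain = P / S
        m, P = m + gain * (y - m), (1.0 - gain) * P
        prev_t = t
    return loglik

ts = np.linspace(0.0, 5.0, 1000)
ys = np.sin(ts) + 0.1 * np.random.default_rng(2).normal(size=ts.size)
ll = ou_kalman_loglik(ts, ys)
```

A dense-kernel GP would need an $O(n^3)$ Cholesky factorization here; the filter visits each of the 1000 points once.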
arXiv Detail & Related papers (2020-03-11T23:20:13Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
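The deterministic quadrature-feature idea can be illustrated in one input dimension: Gauss-Hermite nodes integrate the RBF kernel's Gaussian spectral density exactly, giving a deterministic Fourier-feature map. This one-dimensional, derivative-free construction is a simplification assumed here, not SLEIPNIR's full scheme.

```python
import numpy as np

# Hedged sketch: deterministic Fourier-feature approximation of an RBF
# kernel using probabilists' Gauss-Hermite quadrature (1-D inputs).
def qff_features(x, m=32, ell=1.0):
    """Map 1-D inputs to 2m quadrature Fourier features."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(m)
    # The RBF spectral density is Gaussian; hermegauss integrates against
    # exp(-u^2/2), so normalising by sqrt(2*pi) gives exact expectations.
    w = nodes / ell
    sqrt_p = np.sqrt(weights / np.sqrt(2 * np.pi))
    z = np.outer(x, w)
    return np.concatenate([sqrt_p * np.cos(z), sqrt_p * np.sin(z)], axis=1)

x = np.linspace(-2, 2, 20)
Phi = qff_features(x)
K_approx = Phi @ Phi.T                          # feature-space kernel
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
```

Unlike random Fourier features, the nodes are fixed, so the approximation error is deterministic and decays rapidly in the number of quadrature points.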
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.