Equivariant Spherical Deconvolution: Learning Sparse Orientation
Distribution Functions from Spherical Data
- URL: http://arxiv.org/abs/2102.09462v1
- Date: Wed, 17 Feb 2021 16:04:35 GMT
- Title: Equivariant Spherical Deconvolution: Learning Sparse Orientation
Distribution Functions from Spherical Data
- Authors: Axel Elaldi, Neel Dey, Heejong Kim, Guido Gerig
- Abstract summary: We present a rotation-equivariant unsupervised learning framework for the sparse deconvolution of non-negative scalar fields defined on the unit sphere.
We show improvements in terms of tractography and partial volume estimation on a multi-shell dataset of human subjects.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a rotation-equivariant unsupervised learning framework for the
sparse deconvolution of non-negative scalar fields defined on the unit sphere.
Spherical signals with multiple peaks naturally arise in Diffusion MRI (dMRI),
where each voxel consists of one or more signal sources corresponding to
anisotropic tissue structure such as white matter. Due to spatial and spectral
partial voluming, clinically-feasible dMRI struggles to resolve crossing-fiber
white matter configurations, leading to extensive development in spherical
deconvolution methodology to recover underlying fiber directions. However,
these methods are typically linear and struggle with small crossing-angles and
partial volume fraction estimation. In this work, we improve on current
methodologies by nonlinearly estimating fiber structures via unsupervised
spherical convolutional networks with guaranteed equivariance to spherical
rotation. Experimentally, we first validate our proposition via extensive
single and multi-shell synthetic benchmarks demonstrating competitive
performance against common baselines. We then show improved downstream
performance on fiber tractography measures on the Tractometer benchmark
dataset. Finally, we show downstream improvements in terms of tractography and
partial volume estimation on a multi-shell dataset of human subjects.
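The abstract describes unsupervised, rotation-equivariant sparse deconvolution of spherical signals; as a rough illustration (not the authors' code), the sketch below assumes the standard spherical-deconvolution forward model in a real, even-degree spherical-harmonic (SH) basis, where convolution with an axially symmetric fiber response reduces to scaling each degree-l coefficient block by a single scalar (s_{lm} = r_l * f_{lm}), and combines the usual reconstruction, sparsity, and non-negativity terms of sparse-deconvolution objectives. All names (sh_degree_blocks, convolve_with_response, etc.) are illustrative.

```python
import numpy as np

def sh_degree_blocks(l_max):
    """Yield (degree l, slice) pairs for even-degree real SH coefficients up to l_max."""
    start = 0
    for l in range(0, l_max + 1, 2):
        size = 2 * l + 1
        yield l, slice(start, start + size)
        start += size

def convolve_with_response(fod_sh, response_l, l_max):
    """Spherical convolution theorem: the predicted signal's SH coefficients are the
    FOD's SH coefficients scaled per degree by the axially symmetric fiber response."""
    signal_sh = np.zeros_like(fod_sh)
    for l, blk in sh_degree_blocks(l_max):
        signal_sh[blk] = response_l[l // 2] * fod_sh[blk]
    return signal_sh

def unsupervised_loss(fod_values, signal_sh_pred, signal_sh_meas, lam=1e-3):
    """Reconstruction error plus sparsity and non-negativity penalties on the FOD,
    the typical ingredients of a sparse spherical-deconvolution objective."""
    recon = np.sum((signal_sh_pred - signal_sh_meas) ** 2)
    sparsity = lam * np.sum(np.abs(fod_values))            # favor a few sharp fiber peaks
    nonneg = np.sum(np.clip(-fod_values, 0.0, None) ** 2)  # FODs are non-negative fields
    return recon + sparsity + nonneg

# Tiny usage example with random stand-in data (l_max = 8 gives 45 even-degree coefficients).
# In practice fod_values would be FOD amplitudes sampled on a spherical grid; SH coefficients
# are used here only to keep the sketch short.
l_max = 8
n_coef = sum(2 * l + 1 for l in range(0, l_max + 1, 2))
rng = np.random.default_rng(0)
fod_sh = rng.normal(size=n_coef)                      # FOD coefficients a network might predict
response_l = rng.uniform(0.1, 1.0, size=l_max // 2 + 1)
signal_meas = rng.normal(size=n_coef)                 # measured per-shell signal in the SH basis
pred = convolve_with_response(fod_sh, response_l, l_max)
print(unsupervised_loss(fod_sh, pred, signal_meas))
```

In the paper's framing, the FOD estimate comes from a rotation-equivariant spherical CNN trained against a self-reconstruction objective of this kind, so no ground-truth fiber configurations are needed.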
Related papers
- Learning Spatially-Continuous Fiber Orientation Functions [1.4504054468850665]
We propose FENRI, a novel method that learns spatially-continuous fiber orientation density functions from low-resolution diffusion-weighted images.
We demonstrate that FENRI accurately predicts high-resolution fiber orientations from realistic low-quality data.
arXiv Detail & Related papers (2023-12-10T01:28:47Z)
- Gradient-Based Feature Learning under Structured Data [57.76552698981579]
In the anisotropic setting, the commonly used spherical gradient dynamics may fail to recover the true direction.
We show that appropriate weight normalization that is reminiscent of batch normalization can alleviate this issue.
In particular, under the spiked model with a suitably large spike, the sample complexity of gradient-based training can be made independent of the information exponent.
arXiv Detail & Related papers (2023-09-07T16:55:50Z)
- Bundle-specific Tractogram Distribution Estimation Using Higher-order Streamline Differential Equation [15.371246200911651]
We propose a novel tractography method based on a bundle-specific tractogram distribution function.
A unified framework for any higher-order streamline differential equation is presented to describe the fiber bundles.
At the global level, the tractography process is simplified as the estimation of bundle-specific tractogram distribution (BTD) coefficients.
arXiv Detail & Related papers (2023-07-06T07:45:31Z)
- $E(3) \times SO(3)$-Equivariant Networks for Spherical Deconvolution in Diffusion MRI [4.726777092009554]
We present a framework for sparse deconvolution of volumes where each voxel contains a spherical signal.
This work constructs equivariant deep learning layers that respect the symmetries of spatial rotations, reflections, and translations.
arXiv Detail & Related papers (2023-04-12T18:37:32Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines the diffusion sampling and Krylov subspace methods.
Specifically, we prove that if the tangent space at a sample denoised by Tweedie's formula forms a Krylov subspace, then conjugate gradient (CG) initialized with the denoised data keeps the data-consistency update within that tangent space.
Our proposed method achieves more than 80 times faster inference time than the previous state-of-the-art method.
arXiv Detail & Related papers (2023-03-10T07:42:49Z)
- Learning Discretized Neural Networks under Ricci Flow [51.36292559262042]
We study Discretized Neural Networks (DNNs) composed of low-precision weights and activations.
DNNs suffer from either infinite or zero gradients due to the non-differentiable discrete function during training.
arXiv Detail & Related papers (2023-02-07T10:51:53Z)
- Geometric Scattering on Measure Spaces [12.0756034112778]
We introduce a general, unified model for geometric scattering on measure spaces.
We consider finite measure spaces that are obtained from randomly sampling an unknown manifold.
We propose two methods for constructing a data-driven graph on which the associated graph scattering transform approximates the scattering transform on the underlying manifold.
arXiv Detail & Related papers (2022-08-17T22:40:09Z)
- How can spherical CNNs benefit ML-based diffusion MRI parameter estimation? [2.4417196796959906]
Spherical convolutional neural networks (S-CNN) offer distinct advantages over conventional fully-connected networks (FCN).
Current clinical practice commonly acquires dMRI data consisting of only 6 diffusion-weighted images (DWIs).
arXiv Detail & Related papers (2022-07-01T17:49:26Z)
- On the Effective Number of Linear Regions in Shallow Univariate ReLU Networks: Convergence Guarantees and Implicit Bias [50.84569563188485]
We show that gradient flow converges in direction when labels are determined by the sign of a target network with $r$ neurons.
Our result may already hold for mild over-parameterization, where the width is $\tilde{\mathcal{O}}(r)$ and independent of the sample size.
arXiv Detail & Related papers (2022-05-18T16:57:10Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.