Global Context Is All You Need for Parallel Efficient Tractography Parcellation
- URL: http://arxiv.org/abs/2503.07104v1
- Date: Mon, 10 Mar 2025 09:27:07 GMT
- Title: Global Context Is All You Need for Parallel Efficient Tractography Parcellation
- Authors: Valentin von Bornhaupt, Johannes Grün, Justus Bisten, Tobias Bauer, Theodor Rüber, and Thomas Schultz
- Abstract summary: We propose PETParc, a new method for Parallel Efficient Tractography Parcellation. PETParc is a transformer-based architecture in which the whole-brain tractogram is randomly partitioned into sub-tractograms. Results are often even better than those of prior methods.
- Score: 0.6596280437011043
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Whole-brain tractography in diffusion MRI is often followed by a parcellation in which each streamline is classified as belonging to a specific white matter bundle, or discarded as a false positive. Efficient parcellation is important both in large-scale studies, which have to process huge amounts of data, and in the clinic, where computational resources are often limited. TractCloud is a state-of-the-art approach that aims to maximize accuracy with a local-global representation. We demonstrate that the local context does not contribute to the accuracy of that approach, and is even detrimental when dealing with pathological cases. Based on this observation, we propose PETParc, a new method for Parallel Efficient Tractography Parcellation. PETParc is a transformer-based architecture in which the whole-brain tractogram is randomly partitioned into sub-tractograms whose streamlines are classified in parallel, while serving as global context for each other. This leads to a speedup of up to two orders of magnitude relative to TractCloud, and permits inference even on clinical workstations without a GPU. PETParc accounts for the lack of streamline orientation either via a novel flip-invariant embedding, or by simply using flips as part of data augmentation. Despite the speedup, results are often even better than those of prior methods. The code and pretrained model will be made public upon acceptance.
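The core idea in the abstract, random partitioning of the tractogram into sub-tractograms plus flip handling, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, the averaging-based flip-invariant featurization, and all parameters are hypothetical assumptions chosen only to make the two ideas (flip augmentation vs. a representation with f(x) == f(reversed x)) concrete.

```python
import numpy as np

def random_partition(streamlines, num_parts, rng=None):
    """Randomly split a whole-brain tractogram (a list of streamlines)
    into roughly equal-sized sub-tractograms, as PETParc's abstract
    describes; each sub-tractogram would then be classified in parallel."""
    rng = np.random.default_rng(rng)
    idx = rng.permutation(len(streamlines))
    return [[streamlines[i] for i in part]
            for part in np.array_split(idx, num_parts)]

def flip_augment(streamline, p=0.5, rng=None):
    """Streamlines have no canonical orientation: with probability p,
    reverse the point order as a data augmentation."""
    rng = np.random.default_rng(rng)
    return streamline[::-1] if rng.random() < p else streamline

def flip_invariant_features(points):
    """One simple (assumed, not the paper's) flip-invariant featurization:
    average each point with its mirror point, so the result is identical
    for a streamline and its reversal."""
    points = np.asarray(points)
    return 0.5 * (points + points[::-1])
```

For example, `flip_invariant_features(s)` and `flip_invariant_features(s[::-1])` return the same array for any streamline `s`, which is the property a flip-invariant embedding must satisfy.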
Related papers
- Find A Winning Sign: Sign Is All We Need to Win the Lottery [52.63674911541416]
We show that a sparse network trained by an existing IP method can retain its basin of attraction if its parameter signs and normalization layer parameters are preserved.
To take a step closer to finding a winning ticket, we alleviate the reliance on normalization layer parameters by preventing high error barriers along the linear path between the sparse network trained by our method and its counterpart with normalization layer parameters.
arXiv Detail & Related papers (2025-04-07T09:30:38Z) - SpaRG: Sparsely Reconstructed Graphs for Generalizable fMRI Analysis [8.489318619991534]
Deep learning can help uncover patterns in resting-state functional Magnetic Resonance Imaging (rsfMRI) associated with psychiatric disorders and personal traits.
Yet the problem of interpreting deep learning findings is rarely more evident than in fMRI analyses.
We propose a simple approach to mitigate these challenges grounded on sparsification and self-supervision.
arXiv Detail & Related papers (2024-09-24T18:35:57Z) - Deep Homography Estimation for Visual Place Recognition [49.235432979736395]
We propose a transformer-based deep homography estimation (DHE) network.
It takes the dense feature map extracted by a backbone network as input and fits homography for fast and learnable geometric verification.
Experiments on benchmark datasets show that our method can outperform several state-of-the-art methods.
arXiv Detail & Related papers (2024-02-25T13:22:17Z) - TractCloud: Registration-free tractography parcellation with a novel local-global streamline point cloud representation [63.842881844791094]
Current tractography parcellation methods rely heavily on registration, but registration inaccuracies can affect parcellation.
We propose TractCloud, a registration-free framework that performs whole-brain tractography parcellation directly in individual subject space.
arXiv Detail & Related papers (2023-07-18T06:35:12Z) - FedSpeed: Larger Local Interval, Less Communication Round, and Higher Generalization Accuracy [84.45004766136663]
Federated learning is an emerging distributed machine learning framework.
It suffers from non-vanishing biases introduced by locally inconsistent optima and from rugged client drift caused by local over-fitting.
We propose a novel and practical method, FedSpeed, to alleviate the negative impacts posed by these problems.
arXiv Detail & Related papers (2023-02-21T03:55:29Z) - Assessing Streamline Plausibility Through Randomized Iterative Spherical-Deconvolution Informed Tractogram Filtering [0.0]
Tractography has become an indispensable part of brain connectivity studies.
Many streamlines in tractograms produced by state-of-the-art tractography methods are anatomically implausible.
This study takes a closer look at one such method, Spherical-Deconvolution Informed Filtering of Tractograms (SIFT).
We propose applying SIFT to randomly selected tractogram subsets in order to retrieve multiple assessments for each streamline.
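The randomized-subset idea above can be sketched as a generic wrapper: run a streamline filter on random subsets of the tractogram and aggregate per-streamline acceptance rates. This is a hedged sketch, not the paper's pipeline; `filter_fn` stands in for an actual SIFT run, and all names and parameters are assumptions.

```python
import numpy as np

def randomized_subset_scores(n_streamlines, filter_fn,
                             n_rounds=10, subset_frac=0.5, rng=None):
    """Apply a tractogram filter (filter_fn takes an array of streamline
    indices and returns a keep/discard boolean per index) to random
    subsets, and return each streamline's acceptance rate across the
    rounds in which it was sampled."""
    rng = np.random.default_rng(rng)
    kept = np.zeros(n_streamlines)
    seen = np.zeros(n_streamlines)
    size = int(subset_frac * n_streamlines)
    for _ in range(n_rounds):
        subset = rng.choice(n_streamlines, size=size, replace=False)
        keep = np.asarray(filter_fn(subset), dtype=bool)
        kept[subset] += keep
        seen[subset] += 1
    # Acceptance rate; streamlines never sampled get a score of 0.
    return np.divide(kept, seen, out=np.zeros_like(kept), where=seen > 0)
```

Streamlines with acceptance rates near 1 are consistently retained regardless of which other streamlines are present, which is the kind of multiple-assessment signal the summary describes.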
arXiv Detail & Related papers (2022-05-10T12:36:30Z) - MD-CSDNetwork: Multi-Domain Cross Stitched Network for Deepfake Detection [80.83725644958633]
Current deepfake generation methods leave discriminative artifacts in the frequency spectrum of fake images and videos.
We present a novel approach, termed as MD-CSDNetwork, for combining the features in the spatial and frequency domains to mine a shared discriminative representation.
arXiv Detail & Related papers (2021-09-15T14:11:53Z) - KL Guided Domain Adaptation [88.19298405363452]
Domain adaptation is an important problem and often needed for real-world applications.
A common approach in the domain adaptation literature is to learn a representation of the input that has the same distributions over the source and the target domain.
We show that with a probabilistic representation network, the KL term can be estimated efficiently via minibatch samples.
arXiv Detail & Related papers (2021-06-14T22:24:23Z) - Locally induced Gaussian processes for large-scale simulation experiments [0.0]
We show how placement of inducing points and their multitude can be thwarted by pathologies.
Our proposed methodology hybridizes global inducing point and data subset-based local GP approximation.
We show that local inducing points extend their global and data-subset component parts on the accuracy--computational efficiency frontier.
arXiv Detail & Related papers (2020-08-28T21:37:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.