Parameter-Free Bio-Inspired Channel Attention for Enhanced Cardiac MRI Reconstruction
- URL: http://arxiv.org/abs/2505.23872v1
- Date: Thu, 29 May 2025 12:03:24 GMT
- Title: Parameter-Free Bio-Inspired Channel Attention for Enhanced Cardiac MRI Reconstruction
- Authors: Anam Hashmi, Julia Dietlmeier, Kathleen M. Curran, Noel E. O'Connor
- Abstract summary: We propose a non-linear attention architecture for cardiac MRI reconstruction and hypothesize that insights from ecological principles can guide the development of effective attention mechanisms. Specifically, we investigate a non-linear ecological difference equation that describes single-species population growth to devise a parameter-free attention module.
- Score: 8.904269561863103
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Attention is a fundamental component of the human visual recognition system. The inclusion of attention in a convolutional neural network amplifies relevant visual features and suppresses the less important ones. Integrating attention mechanisms into convolutional neural networks enhances model performance and interpretability. Spatial and channel attention mechanisms have shown significant advantages across many downstream tasks in medical imaging. While existing attention modules have proven to be effective, their design often lacks a robust theoretical underpinning. In this study, we address this gap by proposing a non-linear attention architecture for cardiac MRI reconstruction and hypothesize that insights from ecological principles can guide the development of effective and efficient attention mechanisms. Specifically, we investigate a non-linear ecological difference equation that describes single-species population growth to devise a parameter-free attention module surpassing current state-of-the-art parameter-free methods.
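The abstract describes the core idea only at a high level, so the following is a minimal sketch of one plausible reading, assuming the "non-linear ecological difference equation" for single-species population growth is the classic logistic map x_{n+1} = r * x_n * (1 - x_n) applied to pooled channel statistics. The module name `EcoAttention`, the fixed growth rate `r`, and the normalization steps are illustrative assumptions, not the authors' published design.

```python
import torch
import torch.nn as nn

class EcoAttention(nn.Module):
    """Hypothetical parameter-free channel attention sketch.

    Per-channel descriptors from global average pooling are normalized
    to [0, 1] and passed through one step of the logistic difference
    equation x_{n+1} = r * x_n * (1 - x_n), a classic single-species
    population-growth model. No learnable parameters are introduced.
    NOTE: an illustrative guess, not the paper's exact module.
    """

    def __init__(self, r: float = 3.5):
        super().__init__()
        self.r = r  # fixed growth-rate constant, not learned

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Squeeze: per-channel descriptor via global average pooling.
        s = x.mean(dim=(2, 3))                                 # (B, C)
        # Normalize descriptors into [0, 1] per sample.
        s_min = s.min(dim=1, keepdim=True).values
        s_max = s.max(dim=1, keepdim=True).values
        s = (s - s_min) / (s_max - s_min + 1e-6)
        # One iteration of the logistic map as a non-linear gate.
        w_att = self.r * s * (1.0 - s)                         # (B, C)
        # Rescale into [0, 1] so it acts as an attention weight.
        w_att = w_att / (w_att.max(dim=1, keepdim=True).values + 1e-6)
        return x * w_att.view(b, c, 1, 1)
```

For example, `EcoAttention()(torch.randn(2, 64, 32, 32))` gates each of the 64 channels using only its own statistics, adding zero learnable parameters, which is the property the abstract emphasizes.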
Related papers
- Spiking Meets Attention: Efficient Remote Sensing Image Super-Resolution with Attention Spiking Neural Networks [57.17129753411926]
Spiking neural networks (SNNs) are emerging as a promising alternative to traditional artificial neural networks (ANNs). We propose SpikeSR, which achieves state-of-the-art performance across various remote sensing benchmarks such as AID, DOTA, and DIOR.
arXiv Detail & Related papers (2025-03-06T09:06:06Z) - Self-Attention-Based Contextual Modulation Improves Neural System Identification [2.784365807133169]
Cortical neurons in the primary visual cortex are sensitive to contextual information mediated by horizontal and feedback connections. CNNs integrate global contextual information to model contextual modulation via two mechanisms: successive convolutions and a fully connected readout layer. We find that self-attention can improve neural response predictions over parameter-matched CNNs in two key metrics: tuning curve correlation and peak tuning.
arXiv Detail & Related papers (2024-06-12T03:21:06Z) - A Novel Approach to Chest X-ray Lung Segmentation Using U-net and Modified Convolutional Block Attention Module [0.46040036610482665]
This paper presents a novel approach for lung segmentation in chest X-ray images by integrating U-net with attention mechanisms.
The proposed method enhances the U-net architecture by incorporating a Convolutional Block Attention Module (CBAM); a minimal CBAM sketch appears after this list.
The adoption of the CBAM in conjunction with the U-net architecture marks a significant advancement in the field of medical imaging.
arXiv Detail & Related papers (2024-04-22T16:33:06Z) - Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture that is compatible with and scalable within deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
arXiv Detail & Related papers (2024-04-22T09:40:07Z) - Spatial Temporal Graph Convolution with Graph Structure Self-learning for Early MCI Detection [9.11430195887347]
We propose a spatial temporal graph convolutional network with a novel graph structure self-learning mechanism for EMCI detection.
Results on the Alzheimer's Disease Neuroimaging Initiative database show that our method outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-11T12:29:00Z) - A Generic Shared Attention Mechanism for Various Backbone Neural Networks [53.36677373145012]
Self-attention modules (SAMs) produce strongly correlated attention maps across different layers.
Dense-and-Implicit Attention (DIA) shares SAMs across layers and employs a long short-term memory module.
Our simple yet effective DIA can consistently enhance various network backbones.
arXiv Detail & Related papers (2022-10-27T13:24:08Z) - Attention mechanisms for physiological signal deep learning: which attention should we take? [0.0]
We experimentally analyze four attention mechanisms (squeeze-and-excitation, non-local, convolutional block attention module, and multi-head self-attention) and three convolutional neural network (CNN) architectures; a minimal squeeze-and-excitation sketch appears after this list.
We evaluate multiple combinations for the performance and convergence of physiological signal deep learning models.
arXiv Detail & Related papers (2022-07-04T07:24:08Z) - Focal Attention Networks: optimising attention for biomedical image segmentation [2.5243042477020836]
We investigate the role of the Focal parameter in modulating attention, revealing a link between attention in loss functions and networks.
We achieve optimal performance with fewer attention modules on three well-validated biomedical imaging datasets.
arXiv Detail & Related papers (2021-10-31T16:20:22Z) - Linear Attention Mechanism: An Efficient Attention for Semantic Segmentation [2.9488233765621295]
The Linear Attention Mechanism approximates dot-product attention with much lower memory and computational costs; a generic linear-attention sketch appears after this list.
Experiments on semantic segmentation demonstrate the effectiveness of the linear attention mechanism.
arXiv Detail & Related papers (2020-07-29T15:18:46Z) - Deep Reinforced Attention Learning for Quality-Aware Visual Recognition [73.15276998621582]
We build upon the weakly-supervised generation mechanism of intermediate attention maps in any convolutional neural network.
We introduce a meta critic network to evaluate the quality of attention maps in the main network.
arXiv Detail & Related papers (2020-07-13T02:44:38Z) - Wave Propagation of Visual Stimuli in Focus of Attention [77.4747032928547]
Fast reactions to changes in the surrounding visual environment require efficient attention mechanisms to reallocate computational resources to most relevant locations in the visual field.
We present a biologically plausible model of focus of attention that attains the effectiveness and efficiency exhibited by foveated animals.
arXiv Detail & Related papers (2020-06-19T09:33:21Z) - Untangling tradeoffs between recurrence and self-attention in neural networks [81.30894993852813]
We present a formal analysis of how self-attention affects gradient propagation in recurrent networks.
We prove that it mitigates the problem of vanishing gradients when trying to capture long-term dependencies.
We propose a relevancy screening mechanism that allows for a scalable use of sparse self-attention with recurrence.
arXiv Detail & Related papers (2020-06-16T19:24:25Z)
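The chest X-ray entry above builds on a modified Convolutional Block Attention Module. The paper's specific modification is not described in this listing, so the sketch below shows the standard CBAM of Woo et al. (2018) for reference: channel attention from average- and max-pooled descriptors through a shared MLP, followed by spatial attention from a 7x7 convolution over channel-wise statistics.

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Minimal standard Convolutional Block Attention Module:
    sequential channel attention, then spatial attention."""

    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        # Shared MLP applied to avg- and max-pooled channel descriptors.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Conv over stacked channel-wise avg/max maps for spatial attention.
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Channel attention: sigmoid(MLP(avgpool) + MLP(maxpool)).
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
        # Spatial attention: sigmoid(conv([mean over C; max over C])).
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))
```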
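The physiological-signal study above compares four attention mechanisms; squeeze-and-excitation (Hu et al., 2018) is the simplest of them, and a minimal PyTorch version follows. This is the standard SE block for reference, not code from the cited study.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Standard squeeze-and-excitation channel attention."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # squeeze bottleneck
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),  # excitation
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))             # squeeze: (B, C) channel descriptors
        w = self.fc(s).view(b, c, 1, 1)    # excitation: per-channel gates
        return x * w                       # recalibrate feature maps
```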
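For the linear attention entry, the sketch below illustrates the generic kernel-feature trick that makes attention linear in sequence length, here with the phi(x) = elu(x) + 1 feature map of Katharopoulos et al. (2020). The cited paper derives its own approximation of softmax attention, so treat this as an illustration of the class of method, not a reproduction.

```python
import torch
import torch.nn.functional as F

def linear_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                     eps: float = 1e-6) -> torch.Tensor:
    """Linear-complexity attention via a kernel feature map.

    softmax(Q K^T) V is replaced by phi(Q) (phi(K)^T V), normalized by
    phi(Q) (phi(K)^T 1), with phi(x) = elu(x) + 1, so cost scales as
    O(N * d^2) instead of O(N^2 * d). Shapes: q, k are (B, N, D); v is (B, N, E).
    """
    phi_q = F.elu(q) + 1.0
    phi_k = F.elu(k) + 1.0
    kv = torch.einsum("bnd,bne->bde", phi_k, v)               # sum_n phi(k_n) v_n^T
    z = torch.einsum("bnd,bd->bn", phi_q, phi_k.sum(dim=1))   # per-query normalizer
    return torch.einsum("bnd,bde->bne", phi_q, kv) / (z.unsqueeze(-1) + eps)
```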