Focal Attention Networks: optimising attention for biomedical image
segmentation
- URL: http://arxiv.org/abs/2111.00534v1
- Date: Sun, 31 Oct 2021 16:20:22 GMT
- Title: Focal Attention Networks: optimising attention for biomedical image
segmentation
- Authors: Michael Yeung, Leonardo Rundo, Evis Sala, Carola-Bibiane Schönlieb,
Guang Yang
- Abstract summary: We investigate the role of the Focal parameter in modulating attention, revealing a link between attention in loss functions and networks.
We achieve optimal performance with fewer attention modules on three well-validated biomedical imaging datasets.
- Score: 2.5243042477020836
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, there has been increasing interest to incorporate attention
into deep learning architectures for biomedical image segmentation. The modular
design of attention mechanisms enables flexible integration into convolutional
neural network architectures, such as the U-Net. Whether attention is
appropriate to use, what type of attention to use, and where in the network to
incorporate attention modules are all important considerations that are
currently overlooked. In this paper, we investigate the role of the Focal
parameter in modulating attention, revealing a link between attention in loss
functions and networks. By incorporating a Focal distance penalty term, we
extend the Unified Focal loss framework to include boundary-based losses.
Furthermore, we develop a simple and interpretable, dataset- and model-specific
heuristic to integrate the Focal parameter into the Squeeze-and-Excitation
block and Attention Gate, achieving optimal performance with fewer attention
modules on three well-validated biomedical imaging datasets. This suggests that
judicious use of attention modules results in better performance and
efficiency.
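For context, the Focal parameter referenced above originates in the focal loss, where a modulating factor (1 - p_t)^gamma down-weights well-classified examples. A minimal NumPy sketch of the binary focal loss (illustrative background only; this is not the paper's Unified Focal loss, which also incorporates Dice-based and boundary-based terms):

```python
import numpy as np

def binary_focal_loss(p, y, gamma=2.0, eps=1e-7):
    """Binary focal loss: mean of -(1 - p_t)^gamma * log(p_t).

    p     : predicted foreground probabilities in (0, 1)
    y     : binary ground-truth labels (0 or 1)
    gamma : Focal parameter; gamma=0 recovers plain cross-entropy,
            larger gamma focuses the loss on hard examples.
    """
    p = np.clip(p, eps, 1.0 - eps)              # numerical safety
    p_t = np.where(y == 1, p, 1.0 - p)          # probability of the true class
    return float(np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t)))
```

Raising gamma shrinks the contribution of confidently classified pixels, which is the modulating behaviour the paper relates between loss functions and attention modules.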
Related papers
- A Novel Approach to Chest X-ray Lung Segmentation Using U-net and Modified Convolutional Block Attention Module [0.46040036610482665]
This paper presents a novel approach for lung segmentation in chest X-ray images by integrating U-net with attention mechanisms.
The proposed method enhances the U-net architecture by incorporating a Convolutional Block Attention Module (CBAM).
The adoption of the CBAM in conjunction with the U-net architecture marks a significant advancement in the field of medical imaging.
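CBAM applies channel attention followed by spatial attention. A deliberately stripped-down, parameter-free sketch of that sequential gating idea (the real CBAM derives its gates from learned MLP and convolution layers over both average- and max-pooled descriptors):

```python
import numpy as np

def cbam_like_gating(x):
    """Sequential channel-then-spatial attention gating on a feature map.

    x : feature map of shape (C, H, W), assumed non-negative (post-ReLU).
    Parameter-free simplification for illustration only.
    """
    sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
    ch_gate = sigmoid(x.mean(axis=(1, 2)))     # (C,)  one weight per channel
    x = x * ch_gate[:, None, None]             # channel attention
    sp_gate = sigmoid(x.mean(axis=0))          # (H, W) one weight per location
    return x * sp_gate[None, :, :]             # spatial attention
```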
arXiv Detail & Related papers (2024-04-22T16:33:06Z)
- Holistic Prototype Attention Network for Few-Shot VOS [74.25124421163542]
Few-shot video object segmentation (FSVOS) aims to segment dynamic objects of unseen classes by resorting to a small set of support images.
We propose a holistic prototype attention network (HPAN) for advancing FSVOS.
arXiv Detail & Related papers (2023-07-16T03:48:57Z)
- CAT: Learning to Collaborate Channel and Spatial Attention from Multi-Information Fusion [23.72040577828098]
We propose a plug-and-play attention module, which we term "CAT"-activating the Collaboration between spatial and channel Attentions.
Specifically, we represent traits as trainable coefficients (i.e., colla-factors) to adaptively combine contributions of different attention modules.
Our CAT outperforms existing state-of-the-art attention mechanisms in object detection, instance segmentation, and image classification.
arXiv Detail & Related papers (2022-12-13T02:34:10Z)
- Feedback Chain Network For Hippocampus Segmentation [59.74305660815117]
We propose a novel hierarchical feedback chain network for the hippocampus segmentation task.
The proposed approach achieves state-of-the-art performance on three publicly available datasets.
arXiv Detail & Related papers (2022-11-15T04:32:10Z)
- Learning to ignore: rethinking attention in CNNs [87.01305532842878]
We propose to reformulate the attention mechanism in CNNs to learn to ignore instead of learning to attend.
Specifically, we propose to explicitly learn irrelevant information in the scene and suppress it in the produced representation.
arXiv Detail & Related papers (2021-11-10T13:47:37Z)
- Bayesian Attention Belief Networks [59.183311769616466]
Attention-based neural networks have achieved state-of-the-art results on a wide range of tasks.
This paper introduces Bayesian attention belief networks, which construct a decoder network by modeling unnormalized attention weights.
We show that our method outperforms deterministic attention and state-of-the-art attention in accuracy, uncertainty estimation, generalization across domains, and adversarial attacks.
arXiv Detail & Related papers (2021-06-09T17:46:22Z)
- Coordinate Attention for Efficient Mobile Network Design [96.40415345942186]
We propose a novel attention mechanism for mobile networks by embedding positional information into channel attention.
Unlike channel attention that transforms a feature tensor to a single feature vector via 2D global pooling, the coordinate attention factorizes channel attention into two 1D feature encoding processes.
Our coordinate attention is beneficial to ImageNet classification and behaves better in down-stream tasks, such as object detection and semantic segmentation.
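The factorization described in this abstract replaces a single 2D global pool with two direction-aware 1D pools. A minimal sketch of just that pooling step (shapes and names are our own, not the paper's):

```python
import numpy as np

def coordinate_pooling(x):
    """Direction-aware pooling, the first step of coordinate attention.

    Instead of collapsing a (C, H, W) feature map to one vector per channel
    with 2D global pooling, pool along each spatial axis separately so that
    positional information along the other axis is preserved.
    """
    pool_h = x.mean(axis=2)   # (C, H): average over width, keeps row index
    pool_w = x.mean(axis=1)   # (C, W): average over height, keeps column index
    return pool_h, pool_w
```

Averaging either output over its remaining spatial axis recovers the ordinary 2D global average pool, which is why this is a strict refinement of channel attention.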
arXiv Detail & Related papers (2021-03-04T09:18:02Z)
- Multi-stage Attention ResU-Net for Semantic Segmentation of Fine-Resolution Remote Sensing Images [9.398340832493457]
We propose a Linear Attention Mechanism (LAM) to address this issue.
LAM is approximately equivalent to dot-product attention, but with greater computational efficiency.
We design a Multi-stage Attention ResU-Net for semantic segmentation from fine-resolution remote sensing images.
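Linear attention mechanisms of this kind typically replace the softmax with a positive feature map and reorder the matrix products so the N-by-N similarity matrix is never materialized. A generic sketch (not necessarily LAM's exact formulation):

```python
import numpy as np

def linear_attention(Q, K, V):
    """Softmax-free attention in O(N) memory.

    A positive feature map phi replaces exp(q . k); computing K'V before
    multiplying by Q avoids forming the (N, N) similarity matrix.
    Q, K : (N, d) queries and keys; V : (N, d_v) values.
    """
    phi = lambda t: np.maximum(t, 0.0) + 1e-6      # simple positive feature map
    Qp, Kp = phi(Q), phi(K)
    kv = Kp.T @ V                                  # (d, d_v) key-value summary
    z = Qp @ Kp.sum(axis=0)                        # (N,) per-query normalizer
    return (Qp @ kv) / z[:, None]
```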
arXiv Detail & Related papers (2020-11-29T07:24:21Z)
- Efficient Attention Network: Accelerate Attention by Searching Where to Plug [11.616720452770322]
We propose a framework called Efficient Attention Network (EAN) to improve the efficiency for the existing attention modules.
In EAN, we leverage the sharing mechanism to share the attention module within the backbone and search where to connect the shared attention module via reinforcement learning.
Experiments on widely-used benchmarks and popular attention networks show the effectiveness of EAN.
arXiv Detail & Related papers (2020-11-28T03:31:08Z)
- Deep Reinforced Attention Learning for Quality-Aware Visual Recognition [73.15276998621582]
We build upon the weakly-supervised generation mechanism of intermediate attention maps in any convolutional neural network.
We introduce a meta critic network to evaluate the quality of attention maps in the main network.
arXiv Detail & Related papers (2020-07-13T02:44:38Z)
- Hybrid Multiple Attention Network for Semantic Segmentation in Aerial Images [24.35779077001839]
We propose a novel attention-based framework named Hybrid Multiple Attention Network (HMANet) to adaptively capture global correlations.
We introduce a simple yet effective region shuffle attention (RSA) module to reduce feature redundancy and improve the efficiency of the self-attention mechanism.
arXiv Detail & Related papers (2020-01-09T07:47:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.