Transformers in Healthcare: A Survey
- URL: http://arxiv.org/abs/2307.00067v1
- Date: Fri, 30 Jun 2023 18:14:20 GMT
- Title: Transformers in Healthcare: A Survey
- Authors: Subhash Nerella, Sabyasachi Bandyopadhyay, Jiaqing Zhang, Miguel
Contreras, Scott Siegel, Aysegul Bumin, Brandon Silva, Jessica Sena, Benjamin
Shickel, Azra Bihorac, Kia Khezeli, Parisa Rashidi
- Abstract summary: Transformer is a type of deep learning architecture initially developed to solve general-purpose Natural Language Processing (NLP) tasks.
We provide an overview of how this architecture has been adopted to analyze various forms of data, including medical imaging, structured and unstructured Electronic Health Records (EHR), social media, physiological signals, and biomolecular sequences.
We discuss the benefits and limitations of using transformers in healthcare and examine issues such as computational cost, model interpretability, fairness, alignment with human values, ethical implications, and environmental impact.
- Score: 11.189892739475633
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With Artificial Intelligence (AI) increasingly permeating various aspects of
society, including healthcare, the adoption of the Transformers neural network
architecture is rapidly changing many applications. Transformer is a type of
deep learning architecture initially developed to solve general-purpose Natural
Language Processing (NLP) tasks and has subsequently been adapted in many
fields, including healthcare. In this survey paper, we provide an overview of
how this architecture has been adopted to analyze various forms of data,
including medical imaging, structured and unstructured Electronic Health
Records (EHR), social media, physiological signals, and biomolecular sequences.
Those models could help in clinical diagnosis, report generation, data
reconstruction, and drug/protein synthesis. We identified relevant studies
using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses
(PRISMA) guidelines. We also discuss the benefits and limitations of using
transformers in healthcare and examine issues such as computational cost, model
interpretability, fairness, alignment with human values, ethical implications,
and environmental impact.
Related papers
- Zero Shot Health Trajectory Prediction Using Transformer [11.660997334071952]
Enhanced Transformer for Health Outcome Simulation (ETHOS) is a novel application of the transformer deep-learning architecture for analyzing health data.
ETHOS is trained on Patient Health Timelines (PHTs), detailed, tokenized records of health events, to predict future health trajectories (a toy tokenization sketch follows this entry).
arXiv Detail & Related papers (2024-07-30T18:33:05Z)
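The following toy Python sketch illustrates the general idea of turning a patient's event stream into a discrete token sequence that a transformer could model autoregressively; the event names, vocabulary, and encoding here are invented for illustration and do not reflect ETHOS's actual tokenization scheme.

```python
# Illustrative only: a toy tokenization of one patient's timeline into a
# discrete sequence, in the spirit of Patient Health Timelines. The real
# ETHOS vocabulary, time binning, and event schema differ.
from typing import List

# Hypothetical event stream for one patient: (event_type, value) pairs.
events = [
    ("admission", "emergency"),
    ("lab_creatinine", "high"),
    ("med", "vancomycin"),
    ("procedure", "dialysis"),
    ("discharge", "home"),
]

def build_vocab(event_stream: List[tuple]) -> dict:
    """Map each distinct 'type=value' token to an integer id."""
    tokens = sorted({f"{etype}={val}" for etype, val in event_stream})
    return {tok: i for i, tok in enumerate(tokens)}

vocab = build_vocab(events)
token_ids = [vocab[f"{etype}={val}"] for etype, val in events]
print(token_ids)  # [0, 2, 3, 4, 1] -- a sequence a transformer could model
                  # autoregressively to predict the next health event.
```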
- Transformers-based architectures for stroke segmentation: A review [0.6554326244334866]
Stroke remains a significant global health concern, necessitating precise and efficient diagnostic tools for timely intervention and improved patient outcomes.
Transformers, initially designed for natural language processing, have exhibited remarkable capabilities in various computer vision applications, including medical image analysis.
This review aims to provide an in-depth exploration of the cutting-edge Transformer-based architectures applied in the context of stroke segmentation.
arXiv Detail & Related papers (2024-03-27T14:42:08Z)
- Improved EATFormer: A Vision Transformer for Medical Image Classification [0.0]
This paper presents an improved Evolutionary Algorithm-based Transformer architecture, EATFormer, for medical image classification with Vision Transformers.
The proposed EATFormer architecture combines the strengths of Convolutional Neural Networks and Vision Transformers (a hybrid-architecture sketch follows this entry).
Experimental results on the Chest X-ray and Kvasir datasets demonstrate that the proposed EATFormer significantly improves prediction speed and accuracy compared to baseline models.
arXiv Detail & Related papers (2024-03-19T21:40:20Z)
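As a rough illustration of the CNN/Vision Transformer hybrid idea mentioned above, here is a minimal PyTorch sketch of a convolutional stem feeding a transformer encoder for image classification. It is not the EATFormer architecture; all layer sizes and the overall structure are illustrative assumptions.

```python
# A generic CNN-stem + Transformer-encoder image classifier (NOT EATFormer).
import torch
import torch.nn as nn

class HybridClassifier(nn.Module):
    def __init__(self, num_classes: int = 2, dim: int = 128):
        super().__init__()
        # Convolutional stem extracts local features and downsamples the image.
        self.stem = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, dim, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Transformer encoder models global relationships between patch tokens.
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.stem(x)                       # (B, dim, H/4, W/4)
        tokens = feats.flatten(2).transpose(1, 2)  # (B, N, dim) patch tokens
        encoded = self.encoder(tokens)
        return self.head(encoded.mean(dim=1))      # mean-pool tokens, classify

logits = HybridClassifier()(torch.randn(1, 3, 64, 64))  # -> shape (1, 2)
```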
- Diversifying Knowledge Enhancement of Biomedical Language Models using Adapter Modules and Knowledge Graphs [54.223394825528665]
We develop an approach that uses lightweight adapter modules to inject structured biomedical knowledge into pre-trained language models.
We use two large KGs, the biomedical knowledge system UMLS and the novel biochemical OntoChem, with two prominent biomedical PLMs, PubMedBERT and BioLinkBERT.
We show that our methodology leads to performance improvements in several instances while keeping computing requirements low (a minimal adapter sketch follows this entry).
arXiv Detail & Related papers (2023-12-21T14:26:57Z)
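For the adapter-based knowledge injection described above, the following is a minimal PyTorch sketch of a standard bottleneck adapter added on top of a frozen language-model layer; the hidden and bottleneck sizes, and the exact placement, are assumptions rather than the paper's design.

```python
# A standard bottleneck adapter sketch (sizes and placement are assumptions).
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, hidden: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Only the adapter's ~100k parameters are trained; the pre-trained
        # language model stays frozen, which keeps compute requirements low.
        return h + self.up(self.act(self.down(h)))

h = torch.randn(2, 16, 768)   # (batch, tokens, hidden) from a frozen PLM layer
print(Adapter()(h).shape)     # torch.Size([2, 16, 768])
```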
- A Comprehensive Review of Generative AI in Healthcare [0.0]
Generative AI models, specifically transformers and diffusion models, have played a crucial role in analyzing diverse forms of data, including medical imaging, protein structure prediction, clinical documentation, diagnostic assistance, radiology interpretation, clinical decision support, medical coding, and billing.
This review paper aims to offer a thorough overview of the generative AI applications in healthcare, focusing on transformers and diffusion models.
arXiv Detail & Related papers (2023-10-01T21:13:14Z)
- A Comprehensive Survey on Applications of Transformers for Deep Learning Tasks [60.38369406877899]
Transformer is a deep neural network that employs a self-attention mechanism to capture the contextual relationships within sequential data.
Transformer models excel at handling long-range dependencies between input sequence elements and enable parallel processing (a minimal self-attention sketch follows this entry).
Our survey encompasses the identification of the top five application domains for transformer-based models.
arXiv Detail & Related papers (2023-06-11T23:13:51Z)
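To make the mechanism concrete, here is a minimal PyTorch sketch of single-head scaled dot-product self-attention, the operation the survey entry above refers to; dimensions and weight matrices are toy values.

```python
# Single-head scaled dot-product self-attention: every position attends to
# every other position, which lets the model capture long-range dependencies
# while all positions are processed in parallel.
import torch
import torch.nn.functional as F

def self_attention(x: torch.Tensor, wq, wk, wv) -> torch.Tensor:
    """x has shape (seq_len, d_model); wq, wk, wv are (d_model, d_model)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / (k.shape[-1] ** 0.5)   # pairwise similarity, scaled
    weights = F.softmax(scores, dim=-1)       # attention distribution per token
    return weights @ v                        # weighted mix of all positions

d = 8
x = torch.randn(5, d)                         # 5 tokens, toy embedding size
wq, wk, wv = (torch.randn(d, d) for _ in range(3))
print(self_attention(x, wq, wk, wv).shape)    # torch.Size([5, 8])
```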
- Advances in Medical Image Analysis with Vision Transformers: A Comprehensive Review [6.953789750981636]
We provide an encyclopedic review of the applications of Transformers in medical imaging.
Specifically, we present a systematic and thorough review of relevant recent Transformer literature for different medical image analysis tasks.
arXiv Detail & Related papers (2023-01-09T16:56:23Z)
- Transformers in Medical Imaging: A Survey [88.03790310594533]
Transformers have been successfully applied to several computer vision problems, achieving state-of-the-art results.
Medical imaging has also witnessed growing interest for Transformers that can capture global context compared to CNNs with local receptive fields.
We provide a review of the applications of Transformers in medical imaging covering various aspects, ranging from recently proposed architectural designs to unsolved issues.
arXiv Detail & Related papers (2022-01-24T18:50:18Z)
- Medical Transformer: Gated Axial-Attention for Medical Image Segmentation [73.98974074534497]
We study the feasibility of using Transformer-based network architectures for medical image segmentation tasks.
We propose a Gated Axial-Attention model which extends existing architectures by introducing an additional control mechanism in the self-attention module (a simplified gating sketch follows this entry).
To train the model effectively on medical images, we propose a Local-Global training strategy (LoGo) which further improves performance.
arXiv Detail & Related papers (2021-02-21T18:35:14Z)
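The sketch below loosely illustrates the idea of gating a self-attention output with a learnable parameter, echoing the extra control mechanism described in the entry above; it is a simplified stand-in, not the paper's gated axial-attention mechanism.

```python
# Simplified gated self-attention (illustrative; NOT the paper's design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedSelfAttention(nn.Module):
    def __init__(self, dim: int = 32):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.gate = nn.Parameter(torch.ones(1))   # learnable scalar gate
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = F.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        # The gate lets the model scale down the attention contribution when
        # it is unreliable, e.g. early in training on small medical datasets.
        return x + self.gate * (attn @ v)

x = torch.randn(2, 49, 32)                        # (batch, positions, dim)
print(GatedSelfAttention()(x).shape)              # torch.Size([2, 49, 32])
```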
- Domain Shift in Computer Vision models for MRI data analysis: An Overview [64.69150970967524]
Machine learning and computer vision methods are showing good performance in medical imagery analysis.
Yet only a few applications are now in clinical use.
Poor transferability of the models to data from different sources or acquisition domains is one of the reasons for that.
arXiv Detail & Related papers (2020-10-14T16:34:21Z)
- Machine Learning in Nano-Scale Biomedical Engineering [77.75587007080894]
We review the existing research regarding the use of machine learning in nano-scale biomedical engineering.
The main challenges that can be formulated as ML problems are classified into three main categories.
For each of the presented methodologies, special emphasis is given to its principles, applications, and limitations.
arXiv Detail & Related papers (2020-08-05T15:45:54Z)