Multimodal Model with Text and Drug Embeddings for Adverse Drug Reaction
Classification
- URL: http://arxiv.org/abs/2210.13238v1
- Date: Fri, 21 Oct 2022 11:41:45 GMT
- Title: Multimodal Model with Text and Drug Embeddings for Adverse Drug Reaction
Classification
- Authors: Andrey Sakhovskiy and Elena Tutubalina
- Abstract summary: We introduce a multimodal model with two components. These components are state-of-the-art BERT-based models for language understanding and molecular property prediction.
Experiments show that the molecular information obtained from neural networks is more beneficial for ADE classification than traditional molecular descriptors.
- Score: 9.339007998235378
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we focus on the classification of tweets as sources of
potential signals for adverse drug effects (ADEs) or drug reactions (ADRs).
Following the intuition that text and drug structure representations are
complementary, we introduce a multimodal model with two components. These
components are state-of-the-art BERT-based models for language understanding
and molecular property prediction. Experiments were carried out on multilingual
benchmarks of the Social Media Mining for Health Research and Applications
(#SMM4H) initiative. Our models obtained state-of-the-art results of 0.61 F1
and 0.57 F1 on #SMM4H 2021 Shared Tasks 1a and 2 in English and Russian,
respectively. On the classification of French tweets from SMM4H 2020 Task 1,
our approach pushes the state of the art by an absolute gain of 8% F1. Our
experiments show that the molecular information obtained from neural networks
is more beneficial for ADE classification than traditional molecular
descriptors. The source code for our models is freely available at
https://github.com/Andoree/smm4h_2021_classification.
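To make the two-branch idea concrete, here is a minimal sketch (not the authors' exact architecture; the repository linked above is the reference implementation) that concatenates a BERT [CLS] representation with a precomputed drug embedding before a small classification head. The checkpoint name, embedding size, and head dimensions are assumptions.

```python
# Minimal sketch of a two-branch text + drug-embedding classifier.
# NOT the authors' exact architecture; the drug embedding is assumed to be a
# precomputed vector (e.g., from a molecular property prediction model).
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

TEXT_MODEL = "bert-base-multilingual-cased"  # placeholder checkpoint
DRUG_EMB_DIM = 300                           # assumed molecular embedding size

class MultimodalADEClassifier(nn.Module):
    def __init__(self, text_model_name=TEXT_MODEL, drug_dim=DRUG_EMB_DIM, n_classes=2):
        super().__init__()
        self.text_encoder = AutoModel.from_pretrained(text_model_name)
        hidden = self.text_encoder.config.hidden_size
        # Fuse the [CLS] text representation with the drug embedding by concatenation.
        self.classifier = nn.Sequential(
            nn.Linear(hidden + drug_dim, 256),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(256, n_classes),
        )

    def forward(self, input_ids, attention_mask, drug_emb):
        out = self.text_encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]           # [CLS] token representation
        fused = torch.cat([cls, drug_emb], dim=-1)  # simple late-fusion by concatenation
        return self.classifier(fused)

# Usage sketch: encode a tweet and pair it with a placeholder drug vector.
tokenizer = AutoTokenizer.from_pretrained(TEXT_MODEL)
batch = tokenizer(["this drug gave me a terrible headache"], return_tensors="pt",
                  padding=True, truncation=True)
model = MultimodalADEClassifier()
logits = model(batch["input_ids"], batch["attention_mask"], torch.randn(1, DRUG_EMB_DIM))
```

Concatenation is the simplest late-fusion choice; the drug embedding could come from a neural molecular property model or from classical descriptors, which is exactly the comparison the abstract describes.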
Related papers
- FARM: Functional Group-Aware Representations for Small Molecules [55.281754551202326]
We introduce Functional Group-Aware Representations for Small Molecules (FARM), a foundation model designed to bridge the gap between SMILES, natural language, and molecular graphs.
We rigorously evaluate FARM on the MoleculeNet dataset, where it achieves state-of-the-art performance on 10 out of 12 tasks.
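For intuition about functional-group awareness (a generic illustration only, not FARM's actual tokenization scheme), functional groups can be detected in a SMILES string via SMARTS matching in RDKit; the pattern set below is a hypothetical minimal example.

```python
# Illustrative only: detect a few functional groups in a SMILES string with RDKit.
# This is generic SMARTS matching, not FARM's functional-group-aware tokenizer.
from rdkit import Chem

FUNCTIONAL_GROUPS = {            # hypothetical minimal pattern set
    "carboxylic_acid": "C(=O)[OH]",
    "amide": "C(=O)N",
    "hydroxyl": "[OX2H]",
}

def functional_groups(smiles: str):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return []
    return [name for name, smarts in FUNCTIONAL_GROUPS.items()
            if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts))]

print(functional_groups("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin
```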
arXiv Detail & Related papers (2024-10-02T23:04:58Z)
- Diversifying Knowledge Enhancement of Biomedical Language Models using Adapter Modules and Knowledge Graphs [54.223394825528665]
We develop an approach that uses lightweight adapter modules to inject structured biomedical knowledge into pre-trained language models.
We use two large KGs, the biomedical knowledge system UMLS and the novel biochemical OntoChem, with two prominent biomedical PLMs, PubMedBERT and BioLinkBERT.
We show that our methodology leads to performance improvements in several instances while keeping requirements in computing power low.
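A minimal sketch of the general adapter technique (a Houlsby-style bottleneck with a residual connection), assuming standard hidden sizes; the paper's exact adapter configuration and knowledge-injection objective may differ.

```python
# Generic bottleneck adapter sketch; only the adapter parameters are trained,
# while the pre-trained language model stays frozen, keeping compute low.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, hidden_size: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)   # project down
        self.up = nn.Linear(bottleneck, hidden_size)     # project back up
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection preserves the frozen PLM's representation.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

adapter = Adapter()
out = adapter(torch.randn(2, 16, 768))   # (batch, seq_len, hidden)
```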
arXiv Detail & Related papers (2023-12-21T14:26:57Z)
- Contextualized Medication Information Extraction Using Transformer-based Deep Learning Architectures [35.65283211002216]
We developed NLP systems for medication mention extraction, event classification (indicating whether a medication change is discussed), and context classification.
We explored 6 state-of-the-art pretrained transformer models for the three subtasks, including GatorTron, a large language model pretrained using >90 billion words of text.
Our GatorTron models achieved the best F1-scores of 0.9828 for medication extraction (ranked 3rd), 0.9379 for event classification (ranked 2nd), and the best micro-average accuracy of 0.9126 for context classification.
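A hedged sketch of medication mention extraction framed as token classification with the Hugging Face API; the checkpoint and BIO label set below are placeholders rather than the actual GatorTron pipeline.

```python
# Generic token-classification sketch for medication mention extraction.
# The encoder checkpoint and BIO labels are placeholders for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

LABELS = ["O", "B-Medication", "I-Medication"]   # assumed BIO scheme
name = "bert-base-cased"                          # placeholder encoder

tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForTokenClassification.from_pretrained(name, num_labels=len(LABELS))

text = "Patient was started on metformin 500 mg twice daily."
enc = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**enc).logits                  # (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
print(list(zip(tokens, [LABELS[i] for i in pred_ids])))  # untrained head: labels are arbitrary
```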
arXiv Detail & Related papers (2023-03-14T22:22:28Z)
- Drug Synergistic Combinations Predictions via Large-Scale Pre-Training and Graph Structure Learning [82.93806087715507]
Drug combination therapy is a well-established strategy for disease treatment with better effectiveness and less safety degradation.
Deep learning models have emerged as an efficient way to discover synergistic combinations.
Our framework achieves state-of-the-art results in comparison with other deep learning-based methods.
arXiv Detail & Related papers (2023-01-14T15:07:43Z)
- Dependency-based Mixture Language Models [53.152011258252315]
We introduce the Dependency-based Mixture Language Models.
In detail, we first train neural language models with a novel dependency modeling objective.
We then formulate the next-token probability by mixing the previous dependency modeling probability distributions with self-attention.
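Schematically, such a mixture can be read as a convex combination of a dependency-informed next-token distribution and the standard LM distribution; the scalar weight below is an illustrative stand-in for the paper's attention-based mixing.

```python
# Schematic only: mixing two next-token distributions with a fixed weight.
# The actual formulation mixes dependency-modeling distributions via
# self-attention; lambda here is a single scalar for illustration.
import torch

vocab = 10
p_lm = torch.softmax(torch.randn(vocab), dim=-1)    # standard LM next-token distribution
p_dep = torch.softmax(torch.randn(vocab), dim=-1)   # dependency-modeling distribution
lam = 0.3                                           # assumed mixture weight
p_next = lam * p_dep + (1 - lam) * p_lm             # convex mixture remains a valid distribution
assert torch.isclose(p_next.sum(), torch.tensor(1.0))
```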
arXiv Detail & Related papers (2022-03-19T06:28:30Z)
- Text Mining Drug/Chemical-Protein Interactions using an Ensemble of BERT and T5 Based Models [3.7462395049372894]
In Track-1 of the BioCreative VII Challenge, participants are asked to identify interactions between drugs/chemicals and proteins.
We attempt both a BERT-based sentence classification approach and a more novel text-to-text approach using a T5 model.
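A hedged sketch of the text-to-text framing with T5, where the relation label is generated as output text; the prompt template, entity markers, and label vocabulary below are assumptions, not the authors' exact setup.

```python
# Sketch of framing drug/chemical-protein relation extraction as text-to-text
# with T5. Prompt and target formats are assumed, not the paper's templates.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tok = T5Tokenizer.from_pretrained("t5-small")        # placeholder checkpoint
model = T5ForConditionalGeneration.from_pretrained("t5-small")

prompt = ("classify relation: <chem> aspirin </chem> inhibits "
          "<prot> COX-2 </prot> in platelets")
inputs = tok(prompt, return_tensors="pt")
out_ids = model.generate(**inputs, max_new_tokens=8)
print(tok.decode(out_ids[0], skip_special_tokens=True))  # a fine-tuned model would emit e.g. "INHIBITOR"
```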
arXiv Detail & Related papers (2021-11-30T18:14:06Z)
- R-BERT-CNN: Drug-target interactions extraction from biomedical literature [1.8814209805277506]
We present our participation for the DrugProt task BioCreative VII challenge.
Drug-target interactions (DTIs) are critical for drug discovery and repurposing.
There are more than 32 million biomedical articles on PubMed, and manually extracting DTIs from such a huge knowledge base is challenging.
arXiv Detail & Related papers (2021-10-31T22:50:33Z)
- Neural networks for Anatomical Therapeutic Chemical (ATC) [83.73971067918333]
We propose combining multiple multi-label classifiers trained on distinct sets of features, including sets extracted from a Bidirectional Long Short-Term Memory Network (BiLSTM).
Experiments demonstrate the power of this approach, which is shown to outperform the best methods reported in the literature.
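As a generic illustration of combining multi-label classifiers (not the paper's exact fusion rule or feature sets), per-label probabilities from classifiers trained on different feature representations can be averaged and thresholded:

```python
# Schematic multi-label ensemble: average per-label probabilities from
# classifiers trained on different feature sets, then threshold.
import numpy as np

def ensemble_predict(prob_matrices, threshold=0.5):
    """prob_matrices: list of (n_samples, n_labels) probability arrays."""
    avg = np.mean(np.stack(prob_matrices), axis=0)
    return (avg >= threshold).astype(int)

p1 = np.array([[0.9, 0.2, 0.7]])   # classifier on feature set A
p2 = np.array([[0.6, 0.4, 0.8]])   # classifier on feature set B (e.g., BiLSTM features)
print(ensemble_predict([p1, p2]))  # -> [[1 0 1]]
```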
arXiv Detail & Related papers (2021-01-22T19:49:47Z)
- MolTrans: Molecular Interaction Transformer for Drug Target Interaction Prediction [68.5766865583049]
Drug target interaction (DTI) prediction is a foundational task for in silico drug discovery.
Recent years have witnessed promising progress for deep learning in DTI predictions.
We propose the Molecular Interaction Transformer (MolTrans) to address the limitations of existing approaches.
arXiv Detail & Related papers (2020-04-23T18:56:04Z)
- The Russian Drug Reaction Corpus and Neural Models for Drug Reactions and Effectiveness Detection in User Reviews [13.428173157465062]
The Russian Drug Reaction Corpus (RuDReC) is a new partially annotated corpus of consumer reviews in Russian about pharmaceutical products.
The raw part includes 1.4 million health-related user-generated texts collected from various Internet sources.
The labelled part contains 500 consumer reviews about drug therapy with drug- and disease-related information.
arXiv Detail & Related papers (2020-04-07T19:26:13Z)