Knowledge-aware Attention Network for Protein-Protein Interaction
Extraction
- URL: http://arxiv.org/abs/2001.02091v1
- Date: Tue, 7 Jan 2020 15:02:28 GMT
- Title: Knowledge-aware Attention Network for Protein-Protein Interaction
Extraction
- Authors: Huiwei Zhou, Zhuang Liu, Shixian Ning, Chengkun Lang, Yingyu Lin, Lei Du
- Abstract summary: This paper proposes a knowledge-aware attention network (KAN) to fuse prior knowledge about protein-protein pairs and context information for PPI extraction.
Experimental results on the BioCreative VI PPI dataset show that the proposed approach can acquire knowledge-aware dependencies between different words in a sequence.
- Score: 0.9949801888214528
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Protein-protein interaction (PPI) extraction from published scientific
literature provides additional support for precision medicine efforts. However,
many of the current PPI extraction methods need extensive feature engineering
and cannot make full use of the prior knowledge in knowledge bases (KBs). KBs
contain huge amounts of structured information about entities and
relationships, and therefore play a pivotal role in PPI extraction. This paper
proposes a knowledge-aware attention network (KAN) to fuse prior knowledge
about protein-protein pairs and context information for PPI extraction. The
proposed model first adopts a diagonal-disabled multi-head attention mechanism
to encode context sequence along with knowledge representations learned from
KB. Then a novel multi-dimensional attention mechanism is used to select the
features that can best describe the encoded context. Experimental results on the
BioCreative VI PPI dataset show that the proposed approach could acquire
knowledge-aware dependencies between different words in a sequence and lead to
a new state-of-the-art performance.
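The "diagonal-disabled" attention the abstract describes can be illustrated with a minimal NumPy sketch: mask the diagonal of the attention score matrix so each token attends only to the other tokens in the sequence. This is an illustrative single-head sketch under my own naming, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def diagonal_disabled_attention(Q, K, V):
    """Scaled dot-product attention with the diagonal of the score
    matrix masked out, so each position attends only to the *other*
    positions in the sequence (illustrative sketch)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n, n) pairwise similarity
    np.fill_diagonal(scores, -np.inf)    # disable self-to-self attention
    weights = softmax(scores, axis=-1)   # rows sum to 1; diagonal is 0
    return weights @ V, weights

# toy example: 5 tokens with 8-dimensional representations
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(5, 8)) for _ in range(3))
out, w = diagonal_disabled_attention(Q, K, V)
```

In the paper's multi-head setting this mask would be applied per head before the softmax; the sketch keeps a single head for clarity.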
Related papers
- ProLLM: Protein Chain-of-Thoughts Enhanced LLM for Protein-Protein Interaction Prediction [54.132290875513405]
The prediction of protein-protein interactions (PPIs) is crucial for understanding biological functions and diseases.
Previous machine learning approaches to PPI prediction mainly focus on direct physical interactions.
We propose a novel framework ProLLM that employs an LLM tailored for PPI for the first time.
arXiv Detail & Related papers (2024-03-30T05:32:42Z)
- Extracting Protein-Protein Interactions (PPIs) from Biomedical Literature using Attention-based Relational Context Information [5.456047952635665]
This work presents a unified, multi-source PPI corpora with vetted interaction definitions augmented by binary interaction type labels.
A Transformer-based deep learning method exploits entities' relational context information for relation representation to improve relation classification performance.
The model's performance is evaluated on four widely studied biomedical relation extraction datasets.
arXiv Detail & Related papers (2024-03-08T01:43:21Z)
- MAPE-PPI: Towards Effective and Efficient Protein-Protein Interaction Prediction via Microenvironment-Aware Protein Embedding [82.31506767274841]
Protein-Protein Interactions (PPIs) are fundamental in various biological processes and play a key role in life activities.
MAPE-PPI encodes microenvironments into chemically meaningful discrete codes via a sufficiently large microenvironment "vocabulary".
MAPE-PPI can scale to PPI prediction with millions of PPIs with superior trade-offs between effectiveness and computational efficiency.
arXiv Detail & Related papers (2024-02-22T09:04:41Z)
- Picking the Underused Heads: A Network Pruning Perspective of Attention Head Selection for Fusing Dialogue Coreference Information [50.41829484199252]
Transformer-based models with the multi-head self-attention mechanism are widely used in natural language processing.
We investigate the attention head selection and manipulation strategy for feature injection from a network pruning perspective.
arXiv Detail & Related papers (2023-12-15T05:27:24Z)
- UNTER: A Unified Knowledge Interface for Enhancing Pre-trained Language Models [100.4659557650775]
We propose a UNified knowledge inTERface, UNTER, to provide a unified perspective to exploit both structured knowledge and unstructured knowledge.
With both forms of knowledge injected, UNTER gains continuous improvements on a series of knowledge-driven NLP tasks.
arXiv Detail & Related papers (2023-05-02T17:33:28Z)
- Towards Unified AI Drug Discovery with Multiple Knowledge Modalities [5.232382666884214]
We propose KEDD, a unified, end-to-end, and multimodal deep learning framework.
It optimally incorporates both structured and unstructured knowledge for vast AI drug discovery tasks.
Our framework achieves a deeper understanding of molecule entities and brings significant improvements over state-of-the-art methods.
arXiv Detail & Related papers (2023-04-17T13:15:16Z)
- Structure-aware Protein Self-supervised Learning [50.04673179816619]
We propose a novel structure-aware protein self-supervised learning method to capture structural information of proteins.
In particular, a well-designed graph neural network (GNN) model is pretrained to preserve the protein structural information.
We identify the relation between the sequential information in the protein language model and the structural information in the specially designed GNN model via a novel pseudo bi-level optimization scheme.
arXiv Detail & Related papers (2022-04-06T02:18:41Z)
- Knowledge Graph Augmented Network Towards Multiview Representation Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- Joint Biomedical Entity and Relation Extraction with Knowledge-Enhanced Collective Inference [42.255596963210564]
We present a novel framework that utilizes external knowledge for joint entity and relation extraction named KECI.
KECI takes a collective approach to link mention spans to entities by integrating global relational information into local representations.
Our experimental results show that the framework is highly effective, achieving new state-of-the-art results on two different benchmark datasets.
arXiv Detail & Related papers (2021-05-27T21:33:34Z)
- Leveraging Prior Knowledge for Protein-Protein Interaction Extraction with Memory Network [3.67243903939214]
This paper proposes a novel memory network-based model (MNM) for PPI extraction.
The proposed MNM captures important context clues related to knowledge representations learned from knowledge bases.
arXiv Detail & Related papers (2020-01-07T15:11:27Z)
- Chemical-induced Disease Relation Extraction with Dependency Information and Prior Knowledge [2.9686294158279414]
We propose a novel convolutional attention network (CAN) for chemical-disease relation (CDR) extraction.
Firstly, we extract the shortest dependency path (SDP) between chemical and disease pairs in a sentence.
After that, an attention mechanism is employed to learn the importance/weight of each semantic dependency vector related to knowledge representations learned from KBs.
arXiv Detail & Related papers (2020-01-02T02:24:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.