FairGAT: Fairness-aware Graph Attention Networks
- URL: http://arxiv.org/abs/2303.14591v1
- Date: Sun, 26 Mar 2023 00:10:20 GMT
- Title: FairGAT: Fairness-aware Graph Attention Networks
- Authors: O. Deniz Kose, Yanning Shen
- Abstract summary: Graph attention networks (GATs) have become one of the most widely utilized neural network structures for graph-based tasks.
The influence of the attention design in GATs on algorithmic bias has not been investigated.
A novel algorithm, FairGAT, that leverages a fairness-aware attention design is developed.
- Score: 9.492903649862761
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs can facilitate modeling various complex systems such as gene networks
and power grids, as well as analyzing the underlying relations within them.
Learning over graphs has recently attracted increasing attention, particularly
graph neural network-based (GNN) solutions, among which graph attention
networks (GATs) have become one of the most widely utilized neural network
structures for graph-based tasks. Although the use of graph structures in
learning has been shown to amplify algorithmic bias, the
influence of the attention design in GATs on algorithmic bias has not been
investigated. Motivated by this, the present study first carries out a
theoretical analysis in order to demonstrate the sources of algorithmic bias in
GAT-based learning for node classification. Then, a novel algorithm, FairGAT,
that leverages a fairness-aware attention design is developed based on the
theoretical findings. Experimental results on real-world networks demonstrate
that FairGAT improves group fairness measures while also providing comparable
utility to the fairness-aware baselines for node classification and link
prediction.
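For readers less familiar with GAT attention, the sketch below shows a minimal single-head GAT layer in PyTorch together with a purely illustrative, group-aware adjustment of the attention logits. It is not the FairGAT algorithm itself, whose fairness-aware attention design follows the paper's theoretical analysis; the `sensitive` input and the `gamma` penalty on same-group edges are assumptions made here for illustration only.

```python
# Minimal single-head GAT attention layer with a hypothetical fairness-aware
# re-weighting step. Illustrative sketch only: the actual FairGAT attention
# design is derived from the paper's theoretical analysis and is not
# reproduced here. `sensitive` and `gamma` are assumed names/parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FairnessAwareGATLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, gamma: float = 0.5):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared linear transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)   # attention scoring vector
        self.gamma = gamma                                # strength of the (assumed) fairness adjustment

    def forward(self, x, adj, sensitive):
        # x:         (N, in_dim) node features
        # adj:       (N, N) adjacency matrix with self-loops
        # sensitive: (N,) binary sensitive attribute per node
        h = self.W(x)                                     # (N, out_dim)
        N = h.size(0)

        # Standard GAT logits: e_ij = LeakyReLU(a^T [h_i || h_j])
        h_i = h.unsqueeze(1).expand(N, N, -1)
        h_j = h.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1)).squeeze(-1), 0.2)

        # Hypothetical fairness-aware adjustment: down-weight same-group edges
        # so attention does not simply reinforce the sensitive-group structure.
        same_group = (sensitive.unsqueeze(0) == sensitive.unsqueeze(1)).float()
        e = e - self.gamma * same_group

        # Mask non-edges and normalize over each node's neighborhood.
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=1)
        return alpha @ h                                  # aggregated node representations
```

In a full model, a layer of this kind would replace the standard attention layers of a GAT and be trained jointly with the downstream node classification or link prediction objective.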
Related papers
- GLANCE: Graph Logic Attention Network with Cluster Enhancement for Heterophilous Graph Representation Learning [54.60090631330295]
Graph Neural Networks (GNNs) have demonstrated significant success in learning from graph-structured data but often struggle on heterophilous graphs. We propose GLANCE, a novel framework that integrates logic-guided reasoning, dynamic graph refinement, and adaptive clustering to enhance graph representation learning.
arXiv Detail & Related papers (2025-07-24T15:45:26Z) - Research on the application of graph data structure and graph neural network in node classification/clustering tasks [13.51508928671878]
Graph-structured data are pervasive across domains including social networks, biological networks, and knowledge graphs. Due to their non-Euclidean nature, such data pose significant challenges to conventional machine learning methods. This study investigates graph data structures, classical graph algorithms, and Graph Neural Networks (GNNs).
arXiv Detail & Related papers (2025-07-20T12:57:23Z) - Graph Collaborative Attention Network for Link Prediction in Knowledge Graphs [0.0]
We focus on KBGAT, a graph neural network model that leverages multi-head attention to jointly encode both entity and relation features within local neighborhood structures. We introduce GCAT (Graph Collaborative Attention Network), a refined model that enhances context aggregation and interaction between heterogeneous nodes. Our findings highlight the advantages of attention-based architectures in capturing complex relational patterns for knowledge graph completion tasks.
arXiv Detail & Related papers (2025-07-05T08:13:09Z) - Graph Reasoning Networks [9.18586425686959]
Graph Reasoning Networks (GRNs) are a novel approach that combines the strengths of fixed and learned graph representations with a reasoning module based on a differentiable satisfiability solver.
Results on real-world datasets show comparable performance to GNNs.
Experiments on synthetic datasets demonstrate the potential of the newly proposed method.
arXiv Detail & Related papers (2024-07-08T10:53:49Z) - Fair Graph Neural Network with Supervised Contrastive Regularization [12.666235467177131]
We propose a novel model for training fairness-aware Graph Neural Networks (GNNs).
Our approach integrates Supervised Contrastive Loss and Environmental Loss to enhance both accuracy and fairness.
arXiv Detail & Related papers (2024-04-09T07:49:05Z) - Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Network (GCN) has exhibited remarkable potential in improving graph-based clustering.
Existing models typically estimate an initial graph beforehand in order to apply GCN. A Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
arXiv Detail & Related papers (2024-02-25T07:03:37Z) - GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Generative Graph Neural Networks for Link Prediction [13.643916060589463]
Inferring missing links or detecting spurious ones based on observed graphs, known as link prediction, is a long-standing challenge in graph data analysis.
This paper proposes a novel and radically different link prediction algorithm based on network reconstruction theory, called GraphLP.
Unlike the discriminative neural network models used for link prediction, GraphLP is generative, which provides a new paradigm for neural-network-based link prediction.
arXiv Detail & Related papers (2022-12-31T10:07:19Z) - An Empirical Study of Retrieval-enhanced Graph Neural Networks [48.99347386689936]
Graph Neural Networks (GNNs) are effective tools for graph representation learning.
We propose a retrieval-enhanced scheme called GRAPHRETRIEVAL, which is agnostic to the choice of graph neural network models.
We conduct comprehensive experiments over 13 datasets, and we observe that GRAPHRETRIEVAL is able to reach substantial improvements over existing GNNs.
arXiv Detail & Related papers (2022-06-01T09:59:09Z) - Theory of Graph Neural Networks: Representation and Learning [44.02161831977037]
Graph Neural Networks (GNNs) have become a popular learning model for prediction tasks on nodes, graphs and configurations of points.
This article summarizes a selection of the emerging theoretical results on approximation and learning properties of widely used message passing GNNs and higher-order GNNs.
arXiv Detail & Related papers (2022-04-16T02:08:50Z) - Fair Node Representation Learning via Adaptive Data Augmentation [9.492903649862761]
This work theoretically explains the sources of bias in node representations obtained via Graph Neural Networks (GNNs).
Building upon the analysis, fairness-aware data augmentation frameworks are developed to reduce the intrinsic bias.
Our analysis and proposed schemes can be readily employed to enhance the fairness of various GNN-based learning mechanisms.
arXiv Detail & Related papers (2022-01-21T05:49:15Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.