CrossWalk: Fairness-enhanced Node Representation Learning
- URL: http://arxiv.org/abs/2105.02725v1
- Date: Thu, 6 May 2021 14:45:34 GMT
- Title: CrossWalk: Fairness-enhanced Node Representation Learning
- Authors: Ahmad Khajehnejad, Moein Khajehnejad, Mahmoudreza Babaei, Krishna P.
Gummadi, Adrian Weller, Baharan Mirzasoleiman
- Abstract summary: We develop a simple, effective and general method, CrossWalk, that enhances fairness of various graph algorithms.
The key idea is to bias random walks to cross group boundaries, by upweighting edges which (1) are closer to the groups' peripheries or (2) connect different groups in the network.
CrossWalk pulls nodes that are near groups' peripheries towards their neighbors from other groups.
- Score: 39.21278285140597
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The potential for machine learning systems to amplify social inequities and
unfairness is receiving increasing popular and academic attention. Much recent
work has focused on developing algorithmic tools to assess and mitigate such
unfairness. However, there is little work on enhancing fairness in graph
algorithms. Here, we develop a simple, effective and general method, CrossWalk,
that enhances fairness of various graph algorithms, including influence
maximization, link prediction and node classification, applied to node
embeddings. CrossWalk is applicable to any random walk based node
representation learning algorithm, such as DeepWalk and Node2Vec. The key idea
is to bias random walks to cross group boundaries, by upweighting edges which
(1) are closer to the groups' peripheries or (2) connect different groups in
the network. CrossWalk pulls nodes that are near groups' peripheries towards
their neighbors from other groups in the embedding space, while preserving the
necessary structural information from the graph. Extensive experiments show the
effectiveness of our algorithm to enhance fairness in various graph algorithms,
including influence maximization, link prediction and node classification in
synthetic and real networks, with only a very small decrease in performance.
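As a rough illustration of the key idea in the abstract, the sketch below biases a random walk so that edges connecting different groups are more likely to be traversed. The toy graph, the group labels, and the single boost factor `alpha` are hypothetical simplifications; CrossWalk's actual reweighting also accounts for how close an edge is to a group's periphery.

```python
import random

# Toy graph: adjacency lists plus a group label per node.
# Nodes 0-3 form group "A"; nodes 4-5 form group "B".
adj = {
    0: [1, 2, 3],
    1: [0, 2],
    2: [0, 1, 3],
    3: [0, 2, 4],
    4: [3, 5],
    5: [4],
}
group = {0: "A", 1: "A", 2: "A", 3: "A", 4: "B", 5: "B"}

def biased_walk(start, length, alpha=2.0):
    """One random walk in which boundary-crossing edges are alpha
    times as likely to be taken as within-group edges."""
    walk = [start]
    for _ in range(length - 1):
        node = walk[-1]
        nbrs = adj[node]
        # Upweight neighbors that belong to a different group.
        weights = [alpha if group[n] != group[node] else 1.0 for n in nbrs]
        walk.append(random.choices(nbrs, weights=weights, k=1)[0])
    return walk
```

Walks generated this way would then be fed to a skip-gram model, exactly as in DeepWalk or Node2Vec, so that periphery nodes co-occur more often with neighbors from other groups.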
Related papers
- Federated Graph Semantic and Structural Learning [54.97668931176513]
This paper reveals that local client distortion is brought by both node-level semantics and graph-level structure.
We postulate that a well-structural graph neural network possesses similarity for neighbors due to the inherent adjacency relationships.
We transform the adjacency relationships into the similarity distribution and leverage the global model to distill the relation knowledge into the local model.
arXiv Detail & Related papers (2024-06-27T07:08:28Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
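The Gumbel-Softmax operator mentioned above can be illustrated in plain Python. This is the standard trick (add Gumbel noise to the logits, divide by a temperature, then softmax), not NodeFormer's kernelized version; the example logits and temperature `tau` are arbitrary.

```python
import math
import random

def sample_gumbel():
    """Draw one standard Gumbel(0, 1) sample via inverse transform."""
    u = random.random()
    while u == 0.0:  # guard: log(0) is undefined
        u = random.random()
    return -math.log(-math.log(u))

def gumbel_softmax(logits, tau=1.0):
    """Relaxed one-hot sample from unnormalized logits: perturb each
    logit with Gumbel noise, scale by temperature tau, softmax."""
    z = [(l + sample_gumbel()) / tau for l in logits]
    m = max(z)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]
```

As `tau` approaches 0 the output approaches a hard one-hot vector, which is what makes the operator useful for differentiable discrete structure sampling.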
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model that contrasts cluster assignments, called GRCCA.
It is motivated to jointly exploit local and global information by combining clustering algorithms with contrastive learning.
GRCCA is strongly competitive in most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z)
- Community detection using low-dimensional network embedding algorithms [1.052782170493037]
We rigorously analyze the performance of two major algorithms, DeepWalk and node2vec, in recovering communities for canonical network models.
We prove that, given some fixed co-occurrence window, node2vec using random walks with a low non-backtracking probability can succeed for much sparser networks.
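A random walk with a low backtracking probability, as described above, can be sketched as follows. The toy graph and the `backtrack_prob` parameter are illustrative assumptions, not the paper's construction or analysis.

```python
import random

# Small undirected toy graph as adjacency lists.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

def nb_walk(start, length, backtrack_prob=0.1):
    """Random walk that revisits the previous node only with a small
    probability, whenever another neighbor is available."""
    path = [start]
    prev = None
    for _ in range(length - 1):
        cur = path[-1]
        others = [n for n in adj[cur] if n != prev]
        if others and (prev is None or random.random() >= backtrack_prob):
            nxt = random.choice(others)       # avoid stepping back
        else:
            nxt = random.choice(adj[cur])     # backtracking allowed
        path.append(nxt)
        prev = cur
    return path
```

Discouraging immediate backtracking makes each walk explore more distinct nodes, which intuitively helps community recovery on sparser graphs.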
arXiv Detail & Related papers (2021-11-04T14:57:43Z)
- Degree-Based Random Walk Approach for Graph Embedding [0.0]
A computationally less intensive, node-connectivity-aware uniform sampling method is proposed.
The advantages of the proposed algorithm become more pronounced when it is applied to large graphs.
arXiv Detail & Related papers (2021-10-21T19:16:16Z)
- Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z)
- Unsupervised Domain-adaptive Hash for Networks [81.49184987430333]
Domain-adaptive hash learning has enjoyed considerable success in the computer vision community.
We develop an unsupervised domain-adaptive hash learning method for networks, dubbed UDAH.
arXiv Detail & Related papers (2021-08-20T12:09:38Z)
- Biased Edge Dropout for Enhancing Fairness in Graph Representation Learning [14.664485680918725]
We propose a biased edge dropout algorithm (FairDrop) to counter-act homophily and improve fairness in graph representation learning.
FairDrop can be plugged in easily on many existing algorithms, is efficient, adaptable, and can be combined with other fairness-inducing solutions.
We prove that the proposed algorithm can successfully improve the fairness of all models up to a small or negligible drop in accuracy.
arXiv Detail & Related papers (2021-04-29T08:59:36Z)
- Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode a graph's data structure in a high-dimensional, non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z)
- Graph Neighborhood Attentive Pooling [0.5493410630077189]
Network representation learning (NRL) is a powerful technique for learning low-dimensional vector representation of high-dimensional and sparse graphs.
We propose a novel context-sensitive algorithm called GAP that learns to attend on different parts of a node's neighborhood using attentive pooling networks.
arXiv Detail & Related papers (2020-01-28T15:05:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.