Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy
- URL: http://arxiv.org/abs/2305.12396v2
- Date: Thu, 9 Nov 2023 13:01:15 GMT
- Title: Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy
- Authors: Lei Xu, Lei Chen, Rong Wang, Feiping Nie, Xuelong Li
- Abstract summary: We propose a deep FS method that simultaneously conducts feature selection and differentiable $k$-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability issue of learning $k$-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
- Score: 103.74640329539389
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Feature selection (FS) plays an important role in machine learning:
it extracts important features and accelerates the learning process. In this
paper, we propose a deep FS method that simultaneously conducts feature
selection and differentiable $k$-NN graph learning based on the Dirichlet
Energy. The Dirichlet Energy identifies important features by measuring their
smoothness on the graph structure, and facilitates the learning of a new graph
that reflects the inherent structure in the new feature subspace. We employ
Optimal Transport theory to address the non-differentiability issue of learning
$k$-NN graphs in neural networks, which theoretically makes our method
applicable to other graph neural networks for dynamic graph learning.
Furthermore, the proposed framework is interpretable, since all modules are
designed algorithmically. We validate the effectiveness of our model with
extensive experiments on both synthetic and real-world datasets.
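The two ideas in the abstract can be illustrated with a minimal numpy sketch: score each feature by its Dirichlet energy $f^\top L f = \tfrac{1}{2}\sum_{ij} W_{ij}(f_i - f_j)^2$ on a neighbor graph, where the hard (non-differentiable) $k$-NN selection is replaced by a soft, temperature-controlled weighting. Note the softmax relaxation below is a common stand-in chosen for brevity, not the paper's actual Optimal Transport top-k formulation; the data and temperature are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))  # toy data: 8 samples, 3 features

# Pairwise squared Euclidean distances between samples.
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(D, np.inf)  # exclude self-loops

# Soft neighbor weights: a differentiable relaxation of hard k-NN.
# Smaller tau concentrates mass on the nearest neighbors.
tau = 0.5
W = np.exp(-D / tau)
W = W / W.sum(axis=1, keepdims=True)  # row-stochastic
W = 0.5 * (W + W.T)                   # symmetrize

# Unnormalized graph Laplacian L = diag(degree) - W.
L = np.diag(W.sum(axis=1)) - W

# Dirichlet energy of each feature column f: f^T L f.
# Low energy = the feature varies smoothly across connected samples.
energies = np.einsum('ni,nm,mi->i', X, L, X)
print(energies)  # one nonnegative smoothness score per feature
```

Features with the lowest energy best respect the graph structure, which is the criterion the paper uses to rank features while the graph itself is being learned.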
Related papers
- Learning From Graph-Structured Data: Addressing Design Issues and Exploring Practical Applications in Graph Representation Learning [2.492884361833709]
We present an exhaustive review of the latest advancements in graph representation learning and Graph Neural Networks (GNNs).
GNNs, tailored to handle graph-structured data, excel in deriving insights and predictions from intricate relational information.
Our work delves into the capabilities of GNNs, examining their foundational designs and their application in addressing real-world challenges.
arXiv Detail & Related papers (2024-11-09T19:10:33Z) - Graph Reasoning Networks [9.18586425686959]
Graph Reasoning Networks (GRNs) are a novel approach that combines the strengths of fixed and learned graph representations with a reasoning module based on a differentiable satisfiability solver.
Results on real-world datasets show comparable performance to GNNs.
Experiments on synthetic datasets demonstrate the potential of the newly proposed method.
arXiv Detail & Related papers (2024-07-08T10:53:49Z) - Enhancing Graph Representation Learning with Attention-Driven Spiking Neural Networks [5.627287101959473]
Spiking neural networks (SNNs) have emerged as a promising alternative to traditional neural networks for graph learning tasks.
We propose a novel approach that integrates attention mechanisms with SNNs to improve graph representation learning.
arXiv Detail & Related papers (2024-03-25T12:15:10Z) - Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z) - GIF: A General Graph Unlearning Strategy via Influence Function [63.52038638220563]
Graph Influence Function (GIF) is a model-agnostic unlearning method that can efficiently and accurately estimate parameter changes in response to an $\epsilon$-mass perturbation in deleted data.
We conduct extensive experiments on four representative GNN models and three benchmark datasets to justify GIF's superiority in terms of unlearning efficacy, model utility, and unlearning efficiency.
arXiv Detail & Related papers (2023-04-06T03:02:54Z) - Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
We introduce several key design strategies for graph neural networks.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN).
We show that the proposed model outperforms other state-of-the-art GNN models and achieves up to 64% improvements in accuracy on node classification tasks.
arXiv Detail & Related papers (2021-05-17T06:46:01Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z) - Transfer Learning of Graph Neural Networks with Ego-graph Information Maximization [41.867290324754094]
Graph neural networks (GNNs) have achieved superior performance in various applications, but training dedicated GNNs can be costly for large-scale graphs.
In this work, we establish a theoretically grounded and practically useful framework for the transfer learning of GNNs.
arXiv Detail & Related papers (2020-09-11T02:31:18Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.