Adversarial contamination of networks in the setting of vertex nomination: a new trimming method
- URL: http://arxiv.org/abs/2208.09710v1
- Date: Sat, 20 Aug 2022 15:32:04 GMT
- Authors: Sheyda Peyman, Minh Tang, Vince Lyzinski
- Abstract summary: Spectral graph embeddings provide good algorithmic performance and flexible settings in which regularization can be implemented.
We propose a new trimming method that operates in model space and can address both block-structure contamination and white-noise contamination.
This model trimming is more amenable to theoretical analysis while also demonstrating superior performance in a number of simulations.
- Score: 5.915837770869619
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As graph data becomes more ubiquitous, the need for robust inferential graph
algorithms to operate in these complex data domains is crucial. In many cases
of interest, inference is further complicated by the presence of adversarial
data contamination. The effect of the adversary is frequently to change the
data distribution in ways that negatively affect statistical and algorithmic
performance. We study this phenomenon in the context of vertex nomination, a
semi-supervised information retrieval task for network data. Here, a common
suite of methods relies on spectral graph embeddings, which have been shown to
provide both good algorithmic performance and flexible settings in which
regularization techniques can be implemented to help mitigate the effect of an
adversary. Many current regularization methods rely on direct network trimming
to effectively excise the adversarial contamination, although this direct
trimming often gives rise to complicated dependency structures in the resulting
graph. We propose a new trimming method that operates in model space which can
address both block structure contamination and white noise contamination
(contamination whose distribution is unknown). This model trimming is more
amenable to theoretical analysis while also demonstrating superior performance
in a number of simulations, compared to direct trimming.
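As an illustrative sketch of the spectral graph embeddings the abstract refers to (not the paper's trimming procedure), the standard adjacency spectral embedding can be computed with NumPy; the function name and the toy two-block example below are assumptions for illustration only:

```python
import numpy as np

def adjacency_spectral_embedding(A, d):
    """Embed an undirected graph into R^d using the top-d eigenpairs of its
    adjacency matrix, scaled by the square root of the eigenvalue magnitudes.
    A sketch of a common ingredient in spectral vertex-nomination pipelines."""
    # Eigendecomposition of the symmetric adjacency matrix.
    eigvals, eigvecs = np.linalg.eigh(A)
    # Keep the d eigenpairs with the largest absolute eigenvalues.
    top = np.argsort(np.abs(eigvals))[::-1][:d]
    # Scale eigenvectors by sqrt(|lambda|) to obtain latent positions.
    return eigvecs[:, top] * np.sqrt(np.abs(eigvals[top]))

# Toy example: sample a two-block stochastic block model adjacency matrix.
rng = np.random.default_rng(0)
n, d = 100, 2
labels = np.repeat([0, 1], n // 2)
P = np.where(labels[:, None] == labels[None, :], 0.5, 0.1)
A = rng.binomial(1, np.triu(P, 1))   # sample upper triangle only
A = A + A.T                          # symmetrize; zero diagonal
X = adjacency_spectral_embedding(A, d)  # n-by-d matrix of latent positions
```

In a block model like this, the rows of `X` concentrate around one latent position per block, which is what makes downstream nomination and model-space trimming tractable.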
Related papers
- Modularity aided consistent attributed graph clustering via coarsening [6.522020196906943]
Graph clustering is an important unsupervised learning technique for partitioning graphs with attributes and detecting communities.
We propose a loss function incorporating log-determinant, smoothness, and modularity components using a block majorization-minimization technique.
Our algorithm seamlessly integrates graph neural networks (GNNs) and variational graph autoencoders (VGAEs) to learn enhanced node features and deliver exceptional clustering performance.
arXiv Detail & Related papers (2024-07-09T10:42:19Z) - Hierarchical Over-the-Air Federated Learning with Awareness of Interference and Data Heterogeneity [3.8798345704175534]
We introduce a scalable transmission scheme that efficiently uses a single wireless resource through over-the-air computation.
We show that despite the interference and the data heterogeneity, the proposed scheme achieves high learning accuracy and can significantly outperform the conventional hierarchical algorithm.
arXiv Detail & Related papers (2024-01-02T21:43:01Z) - Probabilistically Rewired Message-Passing Neural Networks [41.554499944141654]
Message-passing graph neural networks (MPNNs) emerged as powerful tools for processing graph-structured input.
MPNNs operate on a fixed input graph structure, ignoring potential noise and missing information.
We devise probabilistically rewired MPNNs (PR-MPNNs) which learn to add relevant edges while omitting less beneficial ones.
arXiv Detail & Related papers (2023-10-03T15:43:59Z) - CONVERT: Contrastive Graph Clustering with Reliable Augmentation [110.46658439733106]
We propose a novel CONtrastiVe Graph ClustEring network with Reliable AugmenTation (CONVERT).
In our method, the data augmentations are processed by the proposed reversible perturb-recover network.
To further guarantee the reliability of semantics, a novel semantic loss is presented to constrain the network.
arXiv Detail & Related papers (2023-08-17T13:07:09Z) - Direct Embedding of Temporal Network Edges via Time-Decayed Line Graphs [51.51417735550026]
Methods for machine learning on temporal networks generally exhibit at least one of two limitations.
We present a simple method that avoids both shortcomings: construct the line graph of the network, which includes a node for each interaction, and weigh the edges of this graph based on the difference in time between interactions.
Empirical results on real-world networks demonstrate our method's efficacy and efficiency on both edge classification and temporal link prediction.
arXiv Detail & Related papers (2022-09-30T18:24:13Z) - Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Latent Network Embedding via Adversarial Auto-encoders [15.656374849760734]
We propose a latent network embedding model based on adversarial graph auto-encoders.
Under this framework, the problem of discovering latent structures is formulated as inferring the latent ties from partial observations.
arXiv Detail & Related papers (2021-09-30T16:49:46Z) - Data Augmentation Through Monte Carlo Arithmetic Leads to More Generalizable Classification in Connectomics [0.0]
We use Monte Carlo Arithmetic to perturb a structural connectome estimation pipeline.
The perturbed networks were captured in an augmented dataset, which was then used for an age classification task.
We find that the benefit of this augmentation does not hinge on a large number of perturbations, suggesting that even minimally perturbing a dataset adds meaningful variance that can be captured by the subsequently designed models.
arXiv Detail & Related papers (2021-09-20T16:06:05Z) - Learning Neural Causal Models with Active Interventions [83.44636110899742]
We introduce an active intervention-targeting mechanism which enables a quick identification of the underlying causal structure of the data-generating process.
Our method significantly reduces the required number of interactions compared with random intervention targeting.
We demonstrate superior performance on multiple benchmarks from simulated to real-world data.
arXiv Detail & Related papers (2021-09-06T13:10:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.