Next Waves in Veridical Network Embedding
- URL: http://arxiv.org/abs/2007.05385v2
- Date: Fri, 13 Aug 2021 00:25:15 GMT
- Title: Next Waves in Veridical Network Embedding
- Authors: Owen G. Ward, Zhen Huang, Andrew Davison, Tian Zheng
- Abstract summary: We propose a framework for network embedding algorithms and discuss how the principles of predictability, computability and stability apply.
The utilization of this framework in network embedding holds the potential to motivate and point to new directions for future research.
- Score: 4.544287346584366
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Embedding nodes of a large network into a metric (e.g., Euclidean) space has
become an area of active research in statistical machine learning, which has
found applications in natural and social sciences. Generally, a representation
of a network object is learned in a Euclidean geometry and is then used for
subsequent tasks regarding the nodes and/or edges of the network, such as
community detection, node classification and link prediction. Network embedding
algorithms have been proposed in multiple disciplines, often with
domain-specific notations and details. In addition, different measures and
tools have been adopted to evaluate and compare the methods proposed under
different settings, often dependent on the downstream tasks. As a result, it is
challenging to study these algorithms in the literature systematically.
Motivated by the recently proposed Veridical Data Science (VDS) framework, we
propose a framework for network embedding algorithms and discuss how the
principles of predictability, computability and stability apply in this
context. The utilization of this framework in network embedding holds the
potential to motivate and point to new directions for future research.
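The abstract describes a generic embed-then-analyze pipeline: learn a Euclidean representation of the nodes, then reuse it for downstream tasks such as community detection, node classification, or link prediction. The sketch below is a minimal illustration of that pipeline, assuming one common choice of method, a spectral (Laplacian-eigenmap style) embedding of a small benchmark network followed by k-means community detection; the embedding method, the karate-club network, and the clustering step are illustrative assumptions, not the paper's own algorithm.

```python
# A minimal sketch (not the paper's method) of the embed-then-analyze pipeline:
# spectral node embedding followed by a downstream community-detection step.
import networkx as nx
import numpy as np
from sklearn.cluster import KMeans

G = nx.karate_club_graph()                        # small benchmark network
L = nx.normalized_laplacian_matrix(G).toarray()   # symmetric normalized Laplacian

# Eigenvectors for the smallest non-trivial eigenvalues give a Euclidean
# embedding of the nodes (Laplacian-eigenmap / spectral style).
eigvals, eigvecs = np.linalg.eigh(L)
dim = 2
embedding = eigvecs[:, 1:dim + 1]                 # drop the trivial eigenvector

# Downstream task: community detection by clustering the embedded nodes.
communities = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embedding)
print(dict(zip(G.nodes(), communities)))
```

Under the stability principle discussed in the paper, one natural check on such a pipeline is to rerun it on perturbed or subsampled versions of the network and ask whether the recovered embeddings or communities change appreciably.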
Related papers
- Enhancing Community Detection in Networks: A Comparative Analysis of Local Metrics and Hierarchical Algorithms [49.1574468325115]
This study evaluates the relevance of using local similarity metrics for community detection.
The efficacy of these metrics was evaluated by applying the base algorithm to several real networks with varying community sizes.
arXiv Detail & Related papers (2024-08-17T02:17:09Z)
- Low-Rank Representations Towards Classification Problem of Complex Networks [0.0]
Complex networks representing social interactions, brain activities, and molecular structures have been widely studied to understand and predict their characteristics as graphs.
Models and algorithms for these networks are used in real-life applications, such as search engines and recommender systems.
We study the performance of such low-rank representations of real-life networks on a network classification problem.
arXiv Detail & Related papers (2022-10-20T19:56:18Z)
- Quasi-orthogonality and intrinsic dimensions as measures of learning and generalisation [55.80128181112308]
We show that the dimensionality and quasi-orthogonality of neural networks' feature spaces may jointly serve as discriminants of a network's performance.
Our findings suggest important relationships between the networks' final performance and properties of their randomly initialised feature spaces.
arXiv Detail & Related papers (2022-03-30T21:47:32Z)
- Network Representation Learning: From Preprocessing, Feature Extraction to Node Embedding [9.844802841686105]
Network representation learning (NRL) advances the conventional graph mining of social networks, knowledge graphs, and complex biomedical and physics information networks.
This survey paper reviews the design principles and the different node embedding techniques for network representation learning over homogeneous networks.
arXiv Detail & Related papers (2021-10-14T17:46:37Z)
- Predicting Critical Nodes in Temporal Networks by Dynamic Graph Convolutional Networks [1.213512753726579]
In temporal networks, it is difficult to identify critical nodes because the network structure changes over time.
This paper proposes a novel and effective learning framework that combines specialized graph convolutional networks (GCNs) with recurrent neural networks (RNNs).
Experimental results on four real-world temporal networks demonstrate that the proposed method outperforms both traditional and deep learning benchmark methods.
arXiv Detail & Related papers (2021-06-19T04:16:18Z)
- A Comprehensive Survey on Community Detection with Deep Learning [93.40332347374712]
A community reveals the features and connections of its members, which differ from those of other communities in the network.
This survey devises and proposes a new taxonomy covering different categories of the state-of-the-art methods.
The main category, i.e., deep neural networks, is further divided into convolutional networks, graph attention networks, generative adversarial networks and autoencoders.
arXiv Detail & Related papers (2021-05-26T14:37:07Z)
- Fusing the Old with the New: Learning Relative Camera Pose with Geometry-Guided Uncertainty [91.0564497403256]
We present a novel framework that involves probabilistic fusion between the two families of predictions during network training.
Our network features a self-attention graph neural network, which drives the learning by enforcing strong interactions between different correspondences.
We propose motion parameterizations suitable for learning and show that our method achieves state-of-the-art performance on the challenging DeMoN and ScanNet datasets.
arXiv Detail & Related papers (2021-04-16T17:59:06Z)
- A Survey of Community Detection Approaches: From Statistical Modeling to Deep Learning [95.27249880156256]
We develop and present a unified architecture of network community-finding methods.
We introduce a new taxonomy that divides the existing methods into two categories, namely probabilistic graphical model and deep learning.
We conclude with discussions of the challenges of the field and suggestions of possible directions for future research.
arXiv Detail & Related papers (2021-01-03T02:32:45Z)
- Developing Constrained Neural Units Over Time [81.19349325749037]
This paper focuses on an alternative way of defining neural networks that differs from the majority of existing approaches.
The structure of the neural architecture is defined by means of a special class of constraints that also extend to the interaction with data.
The proposed theory is cast into the time domain, in which data are presented to the network in an ordered manner.
arXiv Detail & Related papers (2020-09-01T09:07:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.