Modeling the Evolution of Networks as Shrinking Structural Diversity
- URL: http://arxiv.org/abs/2009.09764v1
- Date: Mon, 21 Sep 2020 11:30:07 GMT
- Title: Modeling the Evolution of Networks as Shrinking Structural Diversity
- Authors: Jérôme Kunegis
- Abstract summary: This article reviews and evaluates models of network evolution based on the notion of structural diversity.
We show that diversity is an underlying theme of three principles of network evolution: the preferential attachment model, connectivity and link prediction.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This article reviews and evaluates models of network evolution based on the
notion of structural diversity. We show that diversity is an underlying theme
of three principles of network evolution: the preferential attachment model,
connectivity and link prediction. We show that in all three cases, a dominant
trend towards shrinking diversity is apparent, both theoretically and
empirically. In previous work, many different kinds of data have been modeled
as networks: social structure, navigational structure, transport
infrastructure, communication, etc. Almost all these types of networks are not
static structures, but instead dynamic systems that change continuously. Thus,
an important question concerns the trends observable in these networks and
their interpretation in terms of existing network models. We show in this
article that most numerical network characteristics follow statistically
significant trends going either up or down, and that these trends can be
predicted by considering the notion of diversity. Our work extends previous
work observing a shrinking network diameter to measures such as the clustering
coefficient, power-law exponent and random walk return probability, and
justifies preferential attachment models and link prediction algorithms. We
evaluate our hypothesis experimentally using a diverse collection of
twenty-seven temporally evolving real-world network datasets.
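As a rough illustration of the kind of measurement the abstract describes (a minimal sketch, not the paper's experimental pipeline; the networkx dependency and all parameters are assumptions), the snippet below grows a network by preferential attachment and reports the average clustering coefficient and average shortest-path length on successive snapshots:

```python
# Minimal sketch: grow a preferential-attachment network and track how
# two structural measures evolve with network size.
import random
import networkx as nx

def preferential_attachment_growth(n_final=2000, m=3, snapshot_every=500, seed=0):
    """Grow one network by preferential attachment, measuring snapshots."""
    rng = random.Random(seed)
    G = nx.complete_graph(m + 1)                 # small connected seed graph
    targets = [v for e in G.edges() for v in e]  # each node repeated per degree
    for new_node in range(m + 1, n_final + 1):
        chosen = set()
        while len(chosen) < m:                   # m distinct neighbors, sampled
            chosen.add(rng.choice(targets))      # with degree-proportional odds
        for t in chosen:
            G.add_edge(new_node, t)
            targets += [new_node, t]
        if new_node % snapshot_every == 0:
            c = nx.average_clustering(G)
            d = nx.average_shortest_path_length(G)  # proxy for the diameter trend
            print(f"n={G.number_of_nodes():5d}  clustering={c:.4f}  avg_path={d:.3f}")

preferential_attachment_growth()
```

In the pure preferential-attachment model the clustering coefficient shrinks as the network grows; the article's empirical point is that many such structural measures follow statistically significant monotone trends in real evolving networks.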
Related papers
- Transferable Post-training via Inverse Value Learning [83.75002867411263]
We propose modeling changes at the logits level during post-training using a separate neural network (i.e., the value network).
After training this value network on a small base model using demonstrations, it can be seamlessly integrated with other pre-trained models during inference.
We demonstrate that the resulting value network has broad transferability across pre-trained models of different parameter sizes.
arXiv Detail & Related papers (2024-10-28T13:48:43Z)
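As a loose sketch of the logits-level composition this entry describes (assuming PyTorch; the module names and shapes are illustrative and not taken from the paper), a small value network's output can be added to a frozen base model's logits at inference:

```python
import torch
import torch.nn as nn

class ValueNetwork(nn.Module):
    """Small network that models post-training changes at the logits level."""
    def __init__(self, hidden_dim: int, vocab_size: int):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # Produces a logit correction ("delta") for each position
        return self.proj(hidden)

# Inference-time composition with a frozen base model's logits:
hidden_dim, vocab_size = 16, 100
value_net = ValueNetwork(hidden_dim, vocab_size)
base_logits = torch.randn(1, 8, vocab_size)  # from any pre-trained model
hidden = torch.randn(1, 8, hidden_dim)       # base model hidden states
combined = base_logits + value_net(hidden)   # adjusted prediction
print(combined.shape)                        # torch.Size([1, 8, 100])
```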
- Improving Network Interpretability via Explanation Consistency Evaluation [56.14036428778861]
We propose a framework that acquires more explainable activation heatmaps and simultaneously increases model performance.
Specifically, our framework introduces a new metric, i.e., explanation consistency, to reweight the training samples adaptively in model learning.
Our framework then promotes the model learning by paying closer attention to those training samples with a high difference in explanations.
arXiv Detail & Related papers (2024-08-08T17:20:08Z)
- Visual Prompting Upgrades Neural Network Sparsification: A Data-Model Perspective [64.04617968947697]
We introduce a novel data-model co-design perspective to promote superior weight sparsity.
Specifically, customized Visual Prompts are mounted to upgrade neural network sparsification in our proposed VPNs framework.
arXiv Detail & Related papers (2023-12-03T13:50:24Z)
- Latent Evolution Model for Change Point Detection in Time-varying Networks [11.442584422147368]
Graph-based change point detection (CPD) plays an irreplaceable role in discovering anomalous graphs in time-varying networks.
In practice, real-world graphs such as social networks, traffic networks, and rating networks are constantly evolving over time.
We propose a novel CPD method for dynamic graphs via a latent evolution model.
arXiv Detail & Related papers (2022-12-17T07:39:35Z)
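As a toy baseline for the change-point detection task in the entry above (a hedged sketch, not the paper's latent evolution model; the z-score threshold rule is an assumption), one can flag time steps where the distance between consecutive adjacency matrices spikes:

```python
import numpy as np

def detect_change_points(adjacency_seq, z=2.0):
    """adjacency_seq: list of (n, n) adjacency matrices over time."""
    dists = [np.linalg.norm(adjacency_seq[t] - adjacency_seq[t - 1])
             for t in range(1, len(adjacency_seq))]
    mean, std = np.mean(dists), np.std(dists)
    # Flag a step as a change point when its jump is unusually large
    return [t + 1 for t, d in enumerate(dists) if d > mean + z * std]
```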
- Diversity and Generalization in Neural Network Ensembles [0.0]
We combine and expand previously published results in a theoretically sound framework that describes the relationship between diversity and ensemble performance.
We provide sound answers to the following questions: how to measure diversity, how diversity relates to the generalization error of an ensemble, and how diversity is promoted by neural network ensemble algorithms.
arXiv Detail & Related papers (2021-10-26T15:41:10Z)
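The entry above asks how to measure diversity; one common measure (an illustrative choice, not necessarily the paper's) is the average pairwise disagreement of the ensemble members on held-out data:

```python
import itertools
import numpy as np

def pairwise_disagreement(predictions):
    """predictions: array of shape (n_models, n_samples) of class labels."""
    pairs = list(itertools.combinations(range(len(predictions)), 2))
    return float(np.mean([np.mean(predictions[i] != predictions[j])
                          for i, j in pairs]))

preds = np.array([[0, 1, 1, 0],
                  [0, 1, 0, 0],
                  [1, 1, 1, 0]])
print(pairwise_disagreement(preds))  # mean fraction of samples where pairs disagree
```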
- Network Embedding via Deep Prediction Model [25.727377978617465]
This paper proposes a network embedding framework to capture transfer behaviors on structured networks via deep prediction models.
A network structure embedding layer is added to conventional deep prediction models, including Long Short-Term Memory networks and Recurrent Neural Networks.
Experimental studies are conducted on various datasets, including social networks, citation networks, biomedical networks, collaboration networks, and language networks.
arXiv Detail & Related papers (2021-04-27T16:56:00Z)
- Polynomial Networks in Deep Classifiers [55.90321402256631]
We cast the study of deep neural networks under a unifying framework.
Our framework provides insights on the inductive biases of each model.
The efficacy of the proposed models is evaluated on standard image and audio classification benchmarks.
arXiv Detail & Related papers (2021-04-16T06:41:20Z)
- Dynamic Network Embedding Survey [11.742863376032112]
We give a survey of dynamic network embedding in this paper.
We present two basic data models for dynamic networks, namely a discrete model and a continuous model.
We build a taxonomy that refines the category hierarchy by typical learning models.
arXiv Detail & Related papers (2021-03-29T09:27:53Z)
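A minimal sketch of the two data models the survey entry above names, under the usual interpretation (discrete = a sequence of snapshots, continuous = a timestamped edge stream); the type names are illustrative:

```python
from dataclasses import dataclass
from typing import List, Set, Tuple

Edge = Tuple[int, int]

@dataclass
class DiscreteDynamicNetwork:
    """Discrete model: the network observed as a sequence of snapshots."""
    snapshots: List[Set[Edge]]

@dataclass
class ContinuousDynamicNetwork:
    """Continuous model: every edge event carries its own timestamp."""
    events: List[Tuple[float, Edge]]  # (time, (u, v))

    def snapshot_at(self, t: float) -> Set[Edge]:
        # A discrete view can be derived by cutting the stream at time t
        return {e for ts, e in self.events if ts <= t}
```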
- Understanding the wiring evolution in differentiable neural architecture search [114.31723873105082]
Controversy exists over whether differentiable neural architecture search methods discover wiring topology effectively.
We study the underlying mechanism of several existing differentiable NAS frameworks.
arXiv Detail & Related papers (2020-09-02T18:08:34Z)
- On Robustness and Transferability of Convolutional Neural Networks [147.71743081671508]
Modern deep convolutional networks (CNNs) are often criticized for not generalizing under distributional shifts.
We study the interplay between out-of-distribution and transfer performance of modern image classification CNNs for the first time.
We find that increasing both the training set size and the model size significantly improves robustness to distributional shift.
arXiv Detail & Related papers (2020-07-16T18:39:04Z)
- Link Prediction for Temporally Consistent Networks [6.981204218036187]
Link prediction estimates the next relationship in dynamic networks.
The use of an adjacency matrix to represent dynamically evolving networks limits the ability to analytically learn from heterogeneous, sparse, or forming networks.
We propose a new method of canonically representing heterogeneous time-evolving activities as a temporally parameterized network model.
arXiv Detail & Related papers (2020-06-06T07:28:03Z)
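As a generic baseline for the temporal link prediction task in the entry above (a hedged sketch, not the paper's temporally parameterized model; the toy snapshots are invented), one can score unconnected node pairs on one snapshot by common-neighbor count and check them against the next snapshot:

```python
import itertools
import networkx as nx

def common_neighbors_scores(G_t):
    """Score every unconnected node pair by shared-neighbor count."""
    scores = {}
    for u, v in itertools.combinations(G_t.nodes(), 2):
        if not G_t.has_edge(u, v):
            scores[(u, v)] = len(list(nx.common_neighbors(G_t, u, v)))
    return scores

# Toy snapshots: edges observed up to time t, and new edges at t+1
G_t = nx.Graph([(0, 1), (1, 2), (2, 3), (0, 2)])
new_edges = {(1, 3)}

ranked = sorted(common_neighbors_scores(G_t).items(),
                key=lambda kv: kv[1], reverse=True)
for pair, score in ranked[:3]:
    hit = "hit" if pair in new_edges or pair[::-1] in new_edges else "miss"
    print(pair, score, hit)
```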
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.