Emergence of Scale-Free Networks in Social Interactions among Large
Language Models
- URL: http://arxiv.org/abs/2312.06619v1
- Date: Mon, 11 Dec 2023 18:43:16 GMT
- Title: Emergence of Scale-Free Networks in Social Interactions among Large
Language Models
- Authors: Giordano De Marzo, Luciano Pietronero, David Garcia
- Abstract summary: We analyze the interactions of multiple generative agents using GPT3.5-turbo as a language model.
We show how renaming agents removes the model's skewed token priors and allows it to generate a range of networks, from random networks to more realistic scale-free networks.
- Score: 0.43967817176834806
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Scale-free networks are one of the most famous examples of emergent behavior
and are ubiquitous in social systems, especially online social media in which
users can follow each other. By analyzing the interactions of multiple
generative agents using GPT3.5-turbo as a language model, we demonstrate their
ability to not only mimic individual human linguistic behavior but also exhibit
collective phenomena intrinsic to human societies, in particular the emergence
of scale-free networks. We discovered that this process is disrupted by a
skewed token prior distribution of GPT3.5-turbo, which can lead to networks
with extreme centralization as a kind of alignment. We show how renaming agents
removes these token priors and allows the model to generate a range of networks
from random networks to more realistic scale-free networks.
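The abstract only sketches the experimental setup; as a rough illustration (not the authors' code), the following Python snippet shows one way such a simulation could be structured: agents repeatedly choose whom to follow, with the hypothetical `choose_followee` standing in for the GPT3.5-turbo call, and the resulting in-degree distribution is then inspected for the heavy tail characteristic of scale-free networks.

```python
# Minimal sketch of a follow-network simulation among generative agents.
# NOT the authors' code: `choose_followee` is a hypothetical stand-in for a
# GPT3.5-turbo chat call that would show an agent the current network and
# ask whom to follow; here a popularity-biased random choice is used instead.
import random
from collections import Counter

import networkx as nx


def choose_followee(agent, candidates, graph):
    """Hypothetical stand-in for the LLM decision (popularity-biased)."""
    weights = [graph.in_degree(c) + 1 for c in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]


def simulate(n_agents=100, n_rounds=20, seed=0):
    random.seed(seed)
    # Generic agent names; the paper reports that naming matters, because
    # some names carry skewed token priors in GPT3.5-turbo.
    agents = [f"user_{i:03d}" for i in range(n_agents)]
    g = nx.DiGraph()
    g.add_nodes_from(agents)
    for _ in range(n_rounds):
        for a in agents:
            candidates = [b for b in agents if b != a and not g.has_edge(a, b)]
            if candidates:
                g.add_edge(a, choose_followee(a, candidates, g))
    return g


if __name__ == "__main__":
    g = simulate()
    in_degrees = Counter(d for _, d in g.in_degree())
    # A heavy-tailed in-degree distribution is the signature of a scale-free network.
    print(sorted(in_degrees.items()))
```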
Related papers
- Matrix-weighted networks for modeling multidimensional dynamics [5.257502867974538]
We propose a novel, general framework for modeling multidimensional interacting dynamics: matrix-weighted networks (MWNs).
We present the mathematical foundations of MWNs and examine consensus dynamics and random walks within this context.
Our results reveal that the coherence of MWNs gives rise to non-trivial steady states that generalize the notions of communities and structural balance in traditional networks.
arXiv Detail & Related papers (2024-10-07T16:47:30Z) - Leveraging Low-Rank and Sparse Recurrent Connectivity for Robust
Closed-Loop Control [63.310780486820796]
We show how a parameterization of recurrent connectivity influences robustness in closed-loop settings.
We find that closed-form continuous-time neural networks (CfCs) with fewer parameters can outperform their full-rank, fully-connected counterparts.
arXiv Detail & Related papers (2023-10-05T21:44:18Z) - Modeling Random Networks with Heterogeneous Reciprocity [9.630755176298056]
We develop a methodology to model the diverse reciprocal behavior in growing social networks.
We present a preferential attachment model with heterogeneous reciprocity that imitates the attraction users have for popular users.
We apply the presented methods to the analysis of a Facebook wallpost network where users have non-uniform reciprocal behavior patterns.
arXiv Detail & Related papers (2023-08-19T21:21:25Z) - Generalization and Estimation Error Bounds for Model-based Neural
Networks [78.88759757988761]
We show that the generalization abilities of model-based networks for sparse recovery exceed those of regular ReLU networks.
We derive practical design rules that allow one to construct model-based networks with guaranteed high generalization.
arXiv Detail & Related papers (2023-04-19T16:39:44Z) - Fitting Low-rank Models on Egocentrically Sampled Partial Networks [4.111899441919165]
We propose an approach to fit general low-rank models for egocentrically sampled networks.
This method offers the first theoretical guarantee for egocentric partial network estimation.
We evaluate the technique on several synthetic and real-world networks and show that it delivers competitive performance in link prediction tasks.
arXiv Detail & Related papers (2023-03-09T03:20:44Z) - Characterizing Polarization in Social Networks using the Signed
Relational Latent Distance Model [0.0]
A major current concern in social networks is the emergence of polarization and filter bubbles promoting a mindset of "us-versus-them".
We propose the Signed relational Latent dIstance Model (SLIM), utilizing for the first time the Skellam distribution as a likelihood function for signed networks.
We demonstrate that the model extracts low-dimensional characterizations that well predict friendships and animosity while providing interpretable visualizations defined by extreme positions.
arXiv Detail & Related papers (2023-01-23T16:01:26Z) - Predicting Hidden Links and Missing Nodes in Scale-Free Networks with
Artificial Neural Networks [1.0152838128195467]
We propose a methodology, in the form of an algorithm, to predict hidden links and missing nodes in scale-free networks.
We use Béla Bollobás's directed scale-free random graph model to generate a large set of scale-free network data (a minimal generation sketch appears after this list).
arXiv Detail & Related papers (2021-09-25T10:23:28Z) - A Multi-Semantic Metapath Model for Large Scale Heterogeneous Network
Representation Learning [52.83948119677194]
We propose a multi-semantic metapath (MSM) model for large scale heterogeneous representation learning.
Specifically, we generate multi-semantic metapath-based random walks to construct the heterogeneous neighborhood to handle the unbalanced distributions.
We conduct systematical evaluations for the proposed framework on two challenging datasets: Amazon and Alibaba.
arXiv Detail & Related papers (2020-07-19T22:50:20Z) - Firearm Detection and Segmentation Using an Ensemble of Semantic Neural
Networks [62.997667081978825]
We present a weapon detection system based on an ensemble of semantic Convolutional Neural Networks.
A set of simpler neural networks dedicated to specific tasks requires fewer computational resources and can be trained in parallel.
The overall output of the system, obtained by aggregating the outputs of the individual networks, can be tuned by a user to trade off false positives against false negatives.
arXiv Detail & Related papers (2020-02-11T13:58:16Z) - I Know Where You Are Coming From: On the Impact of Social Media Sources
on AI Model Performance [79.05613148641018]
We study the performance of different machine learning models when trained on multi-modal data from different social networks.
Our initial experimental results reveal that the choice of social network impacts performance.
arXiv Detail & Related papers (2020-02-05T11:10:44Z) - DiffNet++: A Neural Influence and Interest Diffusion Network for Social
Recommendation [50.08581302050378]
Social recommendation has emerged to leverage social connections among users for predicting users' unknown preferences.
A preliminary work proposed a neural influence diffusion network (i.e., DiffNet) for social recommendation.
In this paper, we propose DiffNet++, an improved algorithm over DiffNet that models neural influence diffusion and interest diffusion in a unified framework.
arXiv Detail & Related papers (2020-01-15T08:45:34Z)
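As a side note, the Bollobás directed scale-free model mentioned in the link-prediction entry above ships with NetworkX; the following minimal sketch (an illustration of that generator, not the cited paper's pipeline) produces one such random network and summarizes its in-degree distribution.

```python
# Minimal sketch: generate a directed scale-free graph with the
# Bollobas-Borgs-Chayes-Riordan model as implemented in NetworkX,
# then look at the in-degree distribution (expected to be heavy-tailed).
from collections import Counter

import networkx as nx

# alpha, beta, gamma are the model's edge-creation probabilities and must sum to 1;
# the values below are NetworkX's defaults.
g = nx.scale_free_graph(n=1000, alpha=0.41, beta=0.54, gamma=0.05, seed=42)

in_degree_counts = Counter(d for _, d in g.in_degree())
for degree, count in sorted(in_degree_counts.items())[:10]:
    print(f"in-degree {degree}: {count} nodes")
```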
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.