Universal Topological Regularities of Syntactic Structures: Decoupling
Efficiency from Optimization
- URL: http://arxiv.org/abs/2302.00129v1
- Date: Tue, 31 Jan 2023 22:35:11 GMT
- Title: Universal Topological Regularities of Syntactic Structures: Decoupling
Efficiency from Optimization
- Authors: Fermín Moscoso del Prado Martín
- Abstract summary: This study investigates how the topologies of syntactic graphs reveal traces of the processes that led to their emergence.
I report a new universal regularity in syntactic structures: Their topology is communicatively efficient above chance.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Human syntactic structures are usually represented as graphs. Much research
has focused on the mapping between such graphs and linguistic sequences, but
less attention has been paid to the shapes of the graphs themselves: their
topologies. This study investigates how the topologies of syntactic graphs
reveal traces of the processes that led to their emergence. I report a new
universal regularity in syntactic structures: Their topology is communicatively
efficient above chance. The pattern holds, without exception, for all 124
languages studied, across linguistic families and modalities (spoken, written,
and signed). This pattern can arise from a process optimizing for communicative
efficiency or, alternatively, by construction, as a by-product of a sublinear
preferential attachment process reflecting language production mechanisms known
from psycholinguistics. This dual explanation shows how communicative
efficiency, per se, does not require optimization. Of the two options,
efficiency without optimization offers the better explanation for the new
pattern.
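The abstract's alternative mechanism, sublinear preferential attachment, can be illustrated with a minimal simulation. This is my own sketch, not the paper's code: a tree grows one node at a time, and each new node attaches to an existing node with probability proportional to degree^alpha, where alpha < 1 damps the rich-get-richer effect. Function names and the choice alpha = 0.5 are illustrative assumptions.

```python
import random

def sublinear_pa_tree(n, alpha=0.5, seed=0):
    """Grow a random tree by sublinear preferential attachment.

    Each new node attaches to an existing node with probability
    proportional to degree ** alpha; alpha < 1 weakens the
    rich-get-richer dynamic, as in the production-based account
    sketched in the abstract. Illustrative sketch only.
    """
    rng = random.Random(seed)
    degree = [1, 1]        # start from a single edge 0-1
    edges = [(0, 1)]
    for new in range(2, n):
        # roulette-wheel selection with weights degree[i] ** alpha
        weights = [d ** alpha for d in degree]
        r = rng.random() * sum(weights)
        acc = 0.0
        for node, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        edges.append((node, new))
        degree[node] += 1
        degree.append(1)   # the new node enters with degree 1
    return edges, degree

edges, degree = sublinear_pa_tree(1000, alpha=0.5)
print(len(edges), max(degree))
```

Varying `alpha` toward 1 recovers linear (Barabási–Albert-style) attachment with heavier-tailed degree distributions; values below 1 yield the flatter hub structure the sublinear regime is known for.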
Related papers
- S$^2$GSL: Incorporating Segment to Syntactic Enhanced Graph Structure Learning for Aspect-based Sentiment Analysis [19.740223755240734]
We propose S$2$GSL, incorporating Segment to Syntactic enhanced Graph Structure Learning for ABSA.
S$2$GSL is featured with a segment-aware semantic graph learning and a syntax-based latent graph learning.
arXiv Detail & Related papers (2024-06-05T03:44:35Z)
- Bayesian Optimization of Functions over Node Subsets in Graphs [14.670181702535825]
We propose a novel framework for optimization on graphs.
We map each $k$-node in the original graph to a node in a new graph.
Experiments under both synthetic and real-world setups demonstrate the effectiveness of the proposed BO framework.
arXiv Detail & Related papers (2024-05-24T00:24:55Z)
- Semantic Random Walk for Graph Representation Learning in Attributed Graphs [2.318473106845779]
We propose a novel semantic graph representation (SGR) method to formulate the joint optimization of the two heterogeneous sources into a common high-order proximity based framework.
Conventional embedding methods that consider high-order topology proximities can then be easily applied to the newly constructed graph to learn the representations of both node and attribute.
The learned attribute embeddings can also effectively support some semantic-oriented inference tasks, helping to reveal the graph's deep semantics.
arXiv Detail & Related papers (2023-05-11T02:35:16Z)
- Variational Cross-Graph Reasoning and Adaptive Structured Semantics Learning for Compositional Temporal Grounding [143.5927158318524]
Temporal grounding is the task of locating a specific segment from an untrimmed video according to a query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We argue that the inherent structured semantics inside the videos and language is the crucial factor to achieve compositional generalization.
arXiv Detail & Related papers (2023-01-22T08:02:23Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Learning to Learn Graph Topologies [27.782971146122218]
We learn a mapping from node data to the graph structure based on the idea of learning to optimise (L2O).
The model is trained in an end-to-end fashion with pairs of node data and graph samples.
Experiments on both synthetic and real-world data demonstrate that our model is more efficient than classic iterative algorithms in learning a graph with specific topological properties.
arXiv Detail & Related papers (2021-10-19T08:42:38Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Infusing Finetuning with Semantic Dependencies [62.37697048781823]
We show that, unlike syntax, semantics is not brought to the surface by today's pretrained models.
We then use convolutional graph encoders to explicitly incorporate semantic parses into task-specific finetuning.
arXiv Detail & Related papers (2020-12-10T01:27:24Z)
- Building powerful and equivariant graph neural networks with structural message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
arXiv Detail & Related papers (2020-06-26T17:15:16Z)
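The last entry's two ideas, propagating a one-hot encoding of node identities alongside the features and using a permutation-equivariant (sum) aggregation, can be sketched in a few lines. This is my own toy illustration under those two assumptions, not the paper's implementation; all names are hypothetical.

```python
import numpy as np

def structural_mp(adj, node_feats, steps=2):
    """Toy structural message passing (illustrative sketch only).

    Each node carries a local context matrix, initialized from a
    one-hot encoding of node identities; sum-aggregation over
    neighbors is permutation equivariant.
    """
    n = adj.shape[0]
    context = np.eye(n)                  # one-hot identity per node
    for _ in range(steps):
        context = adj @ context + context  # neighbor sum + self
    # combine structural context with the input features
    return np.concatenate([context @ node_feats, node_feats], axis=1)

# A 3-node path graph with 2-dimensional node features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.ones((3, 2))
out = structural_mp(adj, feats)
print(out.shape)  # (3, 4)
```

Because the update is a sum over neighbors, relabeling the nodes permutes the rows of the output consistently, which is the equivariance property the snippet above refers to.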
This list is automatically generated from the titles and abstracts of the papers in this site.