Linguistic Inspired Graph Analysis
- URL: http://arxiv.org/abs/2105.06216v1
- Date: Thu, 13 May 2021 12:16:30 GMT
- Title: Linguistic Inspired Graph Analysis
- Authors: Andrew Broekman and Linda Marshall
- Abstract summary: Isomorphisms allow human cognition to transcribe a potentially unsolvable problem from one domain to a different domain.
Current approaches focus only on transcribing structural information from the source to the target structure.
Further work is needed to understand how graphs can be enriched so that isomorphisms capture semantic and pragmatic information.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Isomorphisms allow human cognition to transcribe a potentially
unsolvable problem from one domain to a different domain where the problem
might be more easily addressed. Current approaches focus only on transcribing
structural information from the source to the target structure, ignoring
semantic and pragmatic information. Functional Language Theory presents five
subconstructs for the classification and understanding of languages. By
deriving a mapping between the metamodels of linguistics and graph theory, it
is shown that no constructs currently exist in canonical graphs for the
representation of semantic and pragmatic information. Further work is needed
to understand how graphs can be enriched so that isomorphisms capture semantic
and pragmatic information. Capturing this additional information could lead to
a deeper understanding of the source structure and to richer manipulation and
interrogation of the contained relationships. Mathematical graph structures,
in their general definition, do not allow the expression of a source's higher
information levels.
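The abstract's central point, that canonical graph isomorphism compares only structure and discards semantic and pragmatic labels, can be illustrated with a minimal sketch. Everything below (the toy graphs, the part-of-speech attributes, and the helper names) is invented for illustration and is not taken from the paper.

```python
from itertools import permutations

def find_isomorphism(nodes_a, edges_a, nodes_b, edges_b):
    """Brute-force structural isomorphism for small undirected graphs.
    Returns a node mapping if one exists, else None. Node identity
    (semantics) plays no role: only the edge structure is compared."""
    if len(nodes_a) != len(nodes_b):
        return None
    target = {frozenset(e) for e in edges_b}
    for perm in permutations(nodes_b):
        mapping = dict(zip(nodes_a, perm))
        mapped = {frozenset((mapping[u], mapping[v])) for u, v in edges_a}
        if mapped == target:
            return mapping
    return None

# Two three-node path graphs: one carries linguistic labels, the other
# arbitrary symbols. Structurally they are isomorphic, so the purely
# structural test succeeds even though all semantic content differs.
ling_nodes, ling_edges = ["dog", "bites", "man"], [("dog", "bites"), ("bites", "man")]
misc_nodes, misc_edges = ["x", "y", "z"], [("x", "y"), ("y", "z")]

mapping = find_isomorphism(ling_nodes, ling_edges, misc_nodes, misc_edges)
print(mapping is not None)  # True: structure alone matches

# To make an isomorphism respect semantics, nodes would need attributes
# (e.g. hypothetical part-of-speech labels) that the mapping must preserve:
pos_a = {"dog": "NOUN", "bites": "VERB", "man": "NOUN"}
pos_b = {"x": "ADJ", "y": "ADJ", "z": "ADJ"}
semantic_match = mapping is not None and all(pos_a[n] == pos_b[mapping[n]] for n in pos_a)
print(semantic_match)  # False: structure matches, semantics do not
```

The attribute-preserving check at the end is exactly the kind of enrichment the abstract argues is missing from canonical graph definitions.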
Related papers
- Semantic Parsing for Question Answering over Knowledge Graphs [3.10647754288788]
We introduce a novel method with graph-to-segment mapping for question answering over knowledge graphs.
This method centers on semantic parsing, a key approach for interpreting these utterances.
Our framework employs a combination of rule-based and neural-based techniques to parse and construct semantic segment sequences.
arXiv Detail & Related papers (2023-12-01T20:45:06Z)
- GPT4Graph: Can Large Language Models Understand Graph Structured Data? An Empirical Evaluation and Benchmarking [17.7473474499538]
Large language models like ChatGPT have become indispensable to artificial general intelligence.
In this study, we conduct an investigation to assess the proficiency of LLMs in comprehending graph data.
Our findings contribute valuable insights towards bridging the gap between language models and graph understanding.
arXiv Detail & Related papers (2023-05-24T11:53:19Z)
- How Do Transformers Learn Topic Structure: Towards a Mechanistic Understanding [56.222097640468306]
We provide a mechanistic understanding of how transformers learn "semantic structure".
We show, through a combination of mathematical analysis and experiments on Wikipedia data, that the embedding layer and the self-attention layer encode the topical structure.
arXiv Detail & Related papers (2023-03-07T21:42:17Z)
- Variational Cross-Graph Reasoning and Adaptive Structured Semantics Learning for Compositional Temporal Grounding [143.5927158318524]
Temporal grounding is the task of locating a specific segment from an untrimmed video according to a query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We argue that the inherent structured semantics inside the videos and language is the crucial factor to achieve compositional generalization.
arXiv Detail & Related papers (2023-01-22T08:02:23Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- Language-Based Causal Representation Learning [24.008923963650226]
We show that the dynamics is learned over a suitable domain-independent first-order causal language.
The preference for the most compact representation in the language that is compatible with the data provides a strong and meaningful learning bias.
While "classical AI" requires handcrafted representations, similar representations can be learned from unstructured data over the same languages.
arXiv Detail & Related papers (2022-07-12T02:07:58Z)
- Explanation Graph Generation via Pre-trained Language Models: An Empirical Study with Contrastive Learning [84.35102534158621]
We study pre-trained language models that generate explanation graphs in an end-to-end manner.
We propose simple yet effective ways of graph perturbations via node and edge edit operations.
Our methods lead to significant improvements in both structural and semantic accuracy of explanation graphs.
arXiv Detail & Related papers (2022-04-11T00:58:27Z)
- Taxonomy Enrichment with Text and Graph Vector Representations [61.814256012166794]
We address the problem of taxonomy enrichment which aims at adding new words to the existing taxonomy.
We present a new method that allows achieving high results on this task with little effort.
We achieve state-of-the-art results across different datasets and provide an in-depth error analysis of mistakes.
arXiv Detail & Related papers (2022-01-21T09:01:12Z)
- Plurality and Quantification in Graph Representation of Meaning [4.82512586077023]
Our graph language covers the essentials of natural language semantics using only monadic second-order variables.
We present a unification-based mechanism for constructing semantic graphs at a simple syntax-semantics interface.
The present graph formalism is applied to linguistic issues in distributive predication, cross-categorial conjunction, and scope permutation of quantificational expressions.
arXiv Detail & Related papers (2021-12-13T07:04:41Z)
- Graph-Structured Referring Expression Reasoning in The Wild [105.95488002374158]
Grounding referring expressions aims to locate in an image an object referred to by a natural language expression.
We propose a scene graph guided modular network (SGMN) to perform reasoning over a semantic graph and a scene graph.
We also propose Ref-Reasoning, a large-scale real-world dataset for structured referring expression reasoning.
arXiv Detail & Related papers (2020-04-19T11:00:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.