Analogical Proportions
- URL: http://arxiv.org/abs/2006.02854v13
- Date: Wed, 24 Nov 2021 21:50:43 GMT
- Title: Analogical Proportions
- Authors: Christian Antić
- Abstract summary: This paper introduces an abstract framework of analogical proportions of the form `$a$ is to $b$ what $c$ is to $d$' in the general setting of universal algebra.
It turns out that our notion of analogical proportions has appealing mathematical properties.
This paper is a first step towards a theory of analogical reasoning and learning systems with potential applications to fundamental AI-problems.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Analogy-making is at the core of human and artificial intelligence and
creativity with applications to such diverse tasks as proving mathematical
theorems and building mathematical theories, common sense reasoning, learning,
language acquisition, and storytelling. This paper introduces from first
principles an abstract algebraic framework of analogical proportions of the
form `$a$ is to $b$ what $c$ is to $d$' in the general setting of universal
algebra. This enables us to compare mathematical objects possibly across
different domains in a uniform way which is crucial for AI-systems. It turns
out that our notion of analogical proportions has appealing mathematical
properties. As we construct our model from first principles using only
elementary concepts of universal algebra, and since our model questions some
basic properties of analogical proportions presupposed in the literature, to
convince the reader of the plausibility of our model we show that it can be
naturally embedded into first-order logic via model-theoretic types and prove
from that perspective that analogical proportions are compatible with
structure-preserving mappings. This provides conceptual evidence for its
applicability. In a broader sense, this paper is a first step towards a theory
of analogical reasoning and learning systems with potential applications to
fundamental AI-problems like common sense reasoning and computational learning
and creativity.
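To make the intended reading of `$a$ is to $b$ what $c$ is to $d$' concrete, here is a minimal Python sketch over the integers. It is only an illustration under strong simplifying assumptions: the fixed list of candidate transformations (`shift`, `scale`) and the acceptance test are hypothetical stand-ins, not the paper's justification-based, universal-algebraic definition.
```python
# A minimal illustrative sketch of toy analogical proportions `a : b :: c : d'
# over the integers. NOT the paper's formal definition; the hand-picked
# transformations below are assumptions used only to convey the idea that a
# proportion holds when some relation taking a to b also takes c to d.

def candidate_transformations(a, b):
    """Return simple arithmetic maps sending a to b (toy, hypothetical set)."""
    maps = [("shift", lambda x: x + (b - a))]           # x -> x + (b - a)
    if a != 0 and b % a == 0:
        maps.append(("scale", lambda x: x * (b // a)))  # x -> x * (b / a)
    return maps

def is_proportion(a, b, c, d):
    """Check the toy proportion a : b :: c : d."""
    return any(f(c) == d for _, f in candidate_transformations(a, b))

if __name__ == "__main__":
    print(is_proportion(2, 4, 3, 6))  # True: doubling maps 2 to 4 and 3 to 6
    print(is_proportion(2, 4, 5, 7))  # True: adding 2 maps 2 to 4 and 5 to 7
    print(is_proportion(2, 4, 3, 7))  # False: no candidate map works
```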
Related papers
- A Complexity-Based Theory of Compositionality [53.025566128892066]
In AI, compositional representations can enable a powerful form of out-of-distribution generalization.
Here, we propose a formal definition of compositionality that accounts for and extends our intuitions about compositionality.
The definition is conceptually simple, quantitative, grounded in algorithmic information theory, and applicable to any representation.
arXiv Detail & Related papers (2024-10-18T18:37:27Z)
- Analogical proportions II [0.0]
Analogical reasoning is the ability to detect parallels between two seemingly distant objects or situations.
Analogical proportions are expressions of the form `$a$ is to $b$ what $c$ is to $d$' at the core of analogical reasoning.
arXiv Detail & Related papers (2024-05-22T09:02:12Z)
- From Word Models to World Models: Translating from Natural Language to the Probabilistic Language of Thought [124.40905824051079]
We propose rational meaning construction, a computational framework for language-informed thinking.
We frame linguistic meaning as a context-sensitive mapping from natural language into a probabilistic language of thought.
We show that LLMs can generate context-sensitive translations that capture pragmatically-appropriate linguistic meanings.
We extend our framework to integrate cognitively-motivated symbolic modules.
arXiv Detail & Related papers (2023-06-22T05:14:00Z)
- A Simple Generative Model of Logical Reasoning and Statistical Learning [0.6853165736531939]
Statistical learning and logical reasoning are two major fields of AI expected to be unified for human-like machine intelligence.
We here propose a simple Bayesian model of logical reasoning and statistical learning.
We simply model how data causes symbolic knowledge in terms of its satisfiability in formal logic.
arXiv Detail & Related papers (2023-05-18T16:34:51Z)
- Bilingual analogical proportions via hedges [0.0]
Analogical proportions are expressions of the form `$a$ is to $b$ what $c$ is to $d$' at the core of analogical reasoning.
The purpose of this paper is to generalize Antić's unilingual framework to a bilingual one where the underlying languages may differ.
arXiv Detail & Related papers (2023-05-02T08:27:36Z)
- Generalization-based similarity [0.0]
We develop an abstract notion of similarity based on the observation that sets of generalizations encode important properties of elements.
We show that similarity defined in this way has appealing mathematical properties.
arXiv Detail & Related papers (2023-02-13T14:48:59Z)
- A Survey of Deep Learning for Mathematical Reasoning [71.88150173381153]
We review the key tasks, datasets, and methods at the intersection of mathematical reasoning and deep learning over the past decade.
Recent advances in large-scale neural language models have opened up new benchmarks and opportunities to use deep learning for mathematical reasoning.
arXiv Detail & Related papers (2022-12-20T18:46:16Z)
- Artificial Cognitively-inspired Generation of the Notion of Topological Group in the Context of Artificial Mathematical Intelligence [0.0]
We provide the explicit artificial generation (or conceptual computation) for the fundamental mathematical notion of topological groups.
The concept of topological groups is explicitly generated through three different artificial specifications.
arXiv Detail & Related papers (2021-12-05T01:39:34Z)
- Formalising Concepts as Grounded Abstractions [68.24080871981869]
This report shows how representation learning can be used to induce concepts from raw data.
The main technical goal of this report is to show how techniques from representation learning can be married with a lattice-theoretic formulation of conceptual spaces.
arXiv Detail & Related papers (2021-01-13T15:22:01Z)
- Logical Neural Networks [51.46602187496816]
We propose a novel framework seamlessly providing key properties of both neural nets (learning) and symbolic logic (knowledge and reasoning).
Every neuron has a meaning as a component of a formula in a weighted real-valued logic, yielding a highly interpretable disentangled representation.
Inference is omnidirectional rather than focused on predefined target variables, and corresponds to logical reasoning.
arXiv Detail & Related papers (2020-06-23T16:55:45Z)
- Machine Common Sense [77.34726150561087]
Machine common sense remains a broad, potentially unbounded problem in artificial intelligence (AI).
This article deals with aspects of modeling commonsense reasoning, focusing on the domain of interpersonal interactions.
arXiv Detail & Related papers (2020-06-15T13:59:47Z)