Neural Markov Prolog
- URL: http://arxiv.org/abs/2312.01521v1
- Date: Mon, 27 Nov 2023 21:41:47 GMT
- Title: Neural Markov Prolog
- Authors: Alexander Thomson and David Page
- Abstract summary: We propose the language Neural Markov Prolog (NMP) as a means to bridge first order logic and neural network design.
NMP allows for the easy generation and presentation of architectures for images, text, relational databases, or other target data types.
- Score: 57.13568543360899
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The recent rapid advance of AI has been driven largely by innovations in neural network architectures. A concomitant concern is how to understand these resulting systems. In this paper, we propose a tool to assist in both the design of further innovative architectures and the simple yet precise communication of their structure. We propose the language Neural Markov Prolog (NMP), based on both Markov logic and Prolog, as a means to both bridge first order logic and neural network design and to allow for the easy generation and presentation of architectures for images, text, relational databases, or other target data types or their mixtures.
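No code accompanies this listing, but the central idea (logical rules acting as templates that determine which neural connections exist and which weights are tied) can be illustrated with a small, purely hypothetical sketch. The rule, the entity names, and the weight-sharing scheme below are illustrative assumptions, not NMP syntax or semantics from the paper.

```python
# Hypothetical illustration (not NMP syntax): a first-order rule such as
#   h(Y) :- link(X, Y), v(X).
# can be read as an architecture template: every grounding of the rule
# contributes a connection from unit v(X) to unit h(Y), and all groundings
# share one weight matrix because they instantiate the same rule.
import numpy as np

rng = np.random.default_rng(0)

# Ground atoms: four entities with 8-dimensional input features v(X).
entities = ["a", "b", "c", "d"]
features = {e: rng.normal(size=8) for e in entities}

# Groundings of link(X, Y) -- the "relational database" side of the picture.
link = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a"), ("a", "c")]

# One shared weight matrix for the single rule (weight tying across groundings).
W_rule = rng.normal(size=(8, 8)) * 0.1

def relu(x):
    return np.maximum(x, 0.0)

# h(Y) aggregates messages from every X with link(X, Y), all through W_rule.
hidden = {e: np.zeros(8) for e in entities}
for x, y in link:
    hidden[y] += features[x] @ W_rule
hidden = {e: relu(h) for e, h in hidden.items()}

print({e: h[:3].round(3) for e, h in hidden.items()})
```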
Related papers
- Exploring knowledge graph-based neural-symbolic system from application perspective [0.0]
Achieving human-like reasoning and interpretability in AI systems remains a substantial challenge.
The Neural-Symbolic paradigm, which integrates neural networks with symbolic systems, presents a promising pathway toward more interpretable AI.
This paper explores recent advancements in neural-symbolic integration based on Knowledge Graphs.
arXiv Detail & Related papers (2024-05-06T14:40:50Z)
- Cyclic Neural Network [46.05071312173701]
We introduce the groundbreaking Cyclic Neural Networks (Cyclic NNs).
They emulate the flexible and dynamic graph nature of biological neural systems, allowing neuron connections in any graph-like structure, including cycles.
We develop the Graph Over Multi-layer Perceptron, the first detailed model based on this new design paradigm.
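A minimal sketch of computing on a connection graph that contains cycles, by iterating synchronous state updates rather than performing a single feed-forward pass. The graph, dimensions, and update rule are illustrative assumptions, not the model from the paper.

```python
# Minimal sketch of propagating activations on a directed graph that contains
# a cycle, by iterating synchronous updates instead of a single feed-forward
# pass. The graph, sizes, and update rule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

n_units, dim = 5, 4
# Adjacency with a cycle: 0 -> 1 -> 2 -> 0, plus 2 -> 3 -> 4.
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4)]

W = {e: rng.normal(scale=0.3, size=(dim, dim)) for e in edges}
x = rng.normal(size=(n_units, dim))           # external input per unit
h = np.zeros((n_units, dim))                  # unit states

for step in range(10):                        # unroll the cyclic dependencies
    msg = np.zeros_like(h)
    for (src, dst), w in W.items():
        msg[dst] += h[src] @ w
    h = np.tanh(x + msg)                      # states settle over iterations

print(h.round(3))
```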
arXiv Detail & Related papers (2024-01-11T07:31:53Z)
- COOL: A Constraint Object-Oriented Logic Programming Language and its Neural-Symbolic Compilation System [0.0]
We introduce the COOL programming language, which seamlessly combines logical reasoning with neural network technologies.
COOL is engineered to autonomously handle data collection, reducing the need for user-supplied initial data.
It incorporates user prompts into the coding process to reduce the risk of undertraining and enhances interaction among models throughout their lifecycle.
arXiv Detail & Related papers (2023-11-07T06:29:59Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
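A common way to ground a logical formula onto network outputs is a fuzzy (t-norm) relaxation, and a hedged sketch of that general idea follows. The rule, the class names, and the choice of relaxation are illustrative assumptions, not the exact construction used in LOGICSEG.

```python
# Illustrative fuzzy relaxation of a class-hierarchy rule such as
#   car(pixel) -> vehicle(pixel)
# using a product-style relaxation  1 - p_car * (1 - p_vehicle).
# A "logic loss" then penalises predictions that violate the rule, so the
# constraint can be trained with ordinary gradients. Names and the choice of
# t-norm are assumptions for illustration, not taken from the paper.
import numpy as np

def implication_truth(p_ante, p_cons):
    """Relaxed truth value of (ante -> cons) in [0, 1]."""
    return 1.0 - p_ante * (1.0 - p_cons)

# Fake per-pixel probabilities from a segmentation head.
p_car = np.array([0.9, 0.2, 0.7])
p_vehicle = np.array([0.95, 0.1, 0.3])

truth = implication_truth(p_car, p_vehicle)
logic_loss = np.mean(1.0 - truth)   # 0 when the rule is fully satisfied
print(truth.round(3), round(float(logic_loss), 3))
```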
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Neural Architecture Retrieval [27.063268631346713]
We define a new problem, Neural Architecture Retrieval, which retrieves a set of existing neural architectures whose designs are similar to a query architecture.
Existing graph pre-training strategies cannot handle the computational graphs of neural architectures because of their size and motifs.
We introduce multi-level contrastive learning to achieve accurate graph representation learning.
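The paper's multi-level scheme is not reproduced here, but the basic contrastive ingredient can be sketched: an InfoNCE-style loss that pulls two views of the same architecture graph together in embedding space. Batch size, embedding dimension, and temperature are arbitrary assumptions.

```python
# Generic InfoNCE-style contrastive loss over graph embeddings: two "views"
# of the same architecture should embed close together, different
# architectures far apart. Dimensions, batch size, and temperature are
# illustrative assumptions; this is not the paper's multi-level scheme.
import numpy as np

rng = np.random.default_rng(2)

batch, dim, tau = 4, 16, 0.2
z1 = rng.normal(size=(batch, dim))              # embeddings of view 1
z2 = z1 + 0.05 * rng.normal(size=(batch, dim))  # perturbed view 2

def normalize(z):
    return z / np.linalg.norm(z, axis=1, keepdims=True)

z1, z2 = normalize(z1), normalize(z2)
sim = z1 @ z2.T / tau                           # pairwise cosine similarities
log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_prob))              # pull matching pairs together
print(round(float(loss), 4))
```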
arXiv Detail & Related papers (2023-07-16T01:56:41Z)
- Weisfeiler and Leman Go Relational [4.29881872550313]
We investigate the limitations in the expressive power of the well-known Relational GCN and Compositional GCN architectures.
We introduce the $k$-RN architecture that provably overcomes the limitations of the above two architectures.
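Expressiveness results of this kind are typically phrased in terms of a relational variant of the 1-dimensional Weisfeiler-Leman test. The sketch below implements that colour refinement on a toy multi-relational graph; it is background machinery for such results, not the $k$-RN architecture itself, and the example graph is an assumption.

```python
# Relational 1-WL colour refinement on a toy multi-relational graph: a node's
# new colour hashes its old colour together with the multiset of
# (relation, neighbour colour) pairs over outgoing edges. This is standard
# background machinery for such expressiveness results, not k-RN itself.
nodes = [0, 1, 2, 3]
# (source, relation, target) triples.
triples = [(0, "r", 1), (1, "r", 2), (2, "s", 3), (3, "s", 0)]

colors = {v: 0 for v in nodes}                 # uniform initial colouring
for _ in range(3):
    signature = {
        v: (colors[v],
            tuple(sorted((r, colors[t]) for s, r, t in triples if s == v)))
        for v in nodes
    }
    # Re-index signatures to small integers (stable colour compression).
    palette = {sig: i for i, sig in enumerate(sorted(set(signature.values())))}
    colors = {v: palette[signature[v]] for v in nodes}

print(colors)
```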
arXiv Detail & Related papers (2022-11-30T15:56:46Z)
- Model-Based Machine Learning for Communications [110.47840878388453]
We review existing strategies for combining model-based algorithms and machine learning from a high-level perspective.
We focus on symbol detection, which is one of the fundamental tasks of communication receivers.
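Symbol detection is usually introduced via a fully model-based baseline, which is what the hedged sketch below shows: nearest-constellation-point (maximum-likelihood) detection for QPSK over a known AWGN channel. Model-based ML, as surveyed in the paper, replaces or augments pieces of such a pipeline with learned modules; the constellation and noise level here are assumptions.

```python
# A fully model-based baseline for symbol detection: nearest-constellation-
# point decisions for QPSK over a known AWGN channel. The constellation,
# block length, and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)

constellation = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
tx_idx = rng.integers(0, 4, size=1000)
tx = constellation[tx_idx]

noise = 0.3 * (rng.normal(size=tx.shape) + 1j * rng.normal(size=tx.shape))
rx = tx + noise                                # received noisy symbols

# Maximum-likelihood detection under AWGN = nearest constellation point.
dists = np.abs(rx[:, None] - constellation[None, :])
detected = dists.argmin(axis=1)

symbol_error_rate = np.mean(detected != tx_idx)
print(round(float(symbol_error_rate), 4))
```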
arXiv Detail & Related papers (2021-01-12T19:55:34Z)
- Neural Cellular Automata Manifold [84.08170531451006]
We show that the neural network architecture of the Neural Cellular Automata can be encapsulated in a larger NN.
This allows us to propose a new model that encodes a manifold of NCAs, each capable of generating a distinct image.
In biological terms, our approach would play the role of the transcription factors, modulating the mapping of genes into specific proteins that drive cellular differentiation.
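The generic shape of a neural cellular automaton update (each cell revises its state vector from a local neighbourhood through one small shared network) can be sketched as follows. Grid size, state dimension, and the random weights are illustrative assumptions; this is not the NCA-manifold encoding proposed in the paper.

```python
# Generic shape of a neural cellular automaton step: every cell updates its
# state vector from its 3x3 neighbourhood through one small shared network.
# Grid size, state size, and the random weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)

H, W, C = 16, 16, 8                     # grid height, width, channels per cell
state = np.zeros((H, W, C))
state[H // 2, W // 2] = 1.0             # single "seed" cell

W1 = rng.normal(scale=0.1, size=(9 * C, 32))
W2 = rng.normal(scale=0.1, size=(32, C))

def step(s):
    padded = np.pad(s, ((1, 1), (1, 1), (0, 0)))
    # Gather each cell's 3x3 neighbourhood into one perception vector.
    patches = np.stack(
        [padded[i:i + H, j:j + W] for i in range(3) for j in range(3)], axis=-2
    ).reshape(H, W, 9 * C)
    update = np.maximum(patches @ W1, 0.0) @ W2
    return s + update                   # residual update of the cell states

for _ in range(5):
    state = step(state)
print(state.shape, round(float(state.mean()), 4))
```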
arXiv Detail & Related papers (2020-06-22T11:41:57Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
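A minimal sketch of the predictor side only: one graph-convolution-style aggregation over an architecture's computational graph followed by a linear head that regresses accuracy. The graph encoding, sizes, and random weights are assumptions, and the auto-encoder and semi-supervised machinery are omitted.

```python
# Minimal sketch of the predictor side only: one graph-convolution-style
# aggregation over an architecture's computational graph, then a head that
# regresses accuracy. Encoding, sizes, and weights are assumptions.
import numpy as np

rng = np.random.default_rng(5)

# Toy architecture graph: 4 operations with one-hot op-type features.
X = np.eye(4)                                  # node features (op types)
A = np.array([[0, 1, 0, 0],                    # directed edges between ops
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
A_hat = A + np.eye(4)                          # add self-loops
deg = A_hat.sum(axis=1, keepdims=True)

W_gcn = rng.normal(scale=0.5, size=(4, 8))
w_out = rng.normal(scale=0.5, size=8)

H = np.maximum((A_hat / deg) @ X @ W_gcn, 0.0) # one GCN-style layer
graph_emb = H.mean(axis=0)                     # pool node embeddings
predicted_accuracy = 1 / (1 + np.exp(-graph_emb @ w_out))
print(round(float(predicted_accuracy), 3))
```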
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
- Improved Code Summarization via a Graph Neural Network [96.03715569092523]
In general, source code summarization techniques take source code as input and output a natural language description.
We present an approach that uses a graph-based neural architecture that better matches the default structure of the AST to generate these summaries.
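The input side of such a pipeline can be sketched: parse a function into its AST, flatten it into a node/edge list, and run one round of neighbour averaging over node-type embeddings. Python's `ast` module stands in here as an assumption, and the GNN encoder and language decoder that actually generate summaries in the paper are not reproduced.

```python
# Input-side sketch only: parse a function into its AST, flatten it to a
# node/edge list, and run one round of message passing over node-type
# embeddings. The real GNN encoder and summary decoder are not reproduced.
import ast
import numpy as np

rng = np.random.default_rng(6)

source = "def add(a, b):\n    return a + b\n"
tree = ast.parse(source)

nodes, edges = [], []
def walk(node, parent=None):
    idx = len(nodes)
    nodes.append(type(node).__name__)
    if parent is not None:
        edges.append((parent, idx))            # parent-child AST edge
    for child in ast.iter_child_nodes(node):
        walk(child, idx)
walk(tree)

# Random embedding per node type, then one message-passing round.
types = sorted(set(nodes))
emb_table = {t: rng.normal(size=8) for t in types}
h = np.stack([emb_table[t] for t in nodes])
agg = h.copy()
for p, c in edges:
    agg[p] += h[c]                             # children inform their parent
    agg[c] += h[p]                             # and vice versa
h = np.tanh(agg)

print(len(nodes), "AST nodes; graph embedding:", h.mean(axis=0).round(3))
```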
arXiv Detail & Related papers (2020-04-06T17:36:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.