ZodiacEdge: a Datalog Engine With Incremental Rule Set Maintenance
- URL: http://arxiv.org/abs/2312.14530v1
- Date: Fri, 22 Dec 2023 08:53:48 GMT
- Title: ZodiacEdge: a Datalog Engine With Incremental Rule Set Maintenance
- Authors: Weiqin Xu and Olivier Curé
- Abstract summary: We tackle the incremental maintenance of Datalog inference materialisation when the rule set can be updated.
This is particularly relevant in the context of the Internet of Things and Edge computing where smart devices may need to reason over newly acquired knowledge represented as Datalog rules.
Our solution is based on an adaptation of a stratification strategy applied to a dependency hypergraph whose nodes correspond to rule sets in a Datalog program.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we tackle the incremental maintenance of Datalog inference
materialisation when the rule set can be updated. This is particularly relevant
in the context of the Internet of Things and Edge computing where smart devices
may need to reason over newly acquired knowledge represented as Datalog rules.
Our solution is based on an adaptation of a stratification strategy applied to
a dependency hypergraph whose nodes correspond to rule sets in a Datalog
program. Our implementation supports recursive rules containing both negation
and aggregation. We demonstrate the effectiveness of our system on real and
synthetic data.
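The stratification strategy described above assigns rules to strata so that negated dependencies always point to a strictly lower stratum, which can then be evaluated first. The sketch below is a minimal, hypothetical illustration of that idea over a toy predicate dependency graph; the names and the relaxation loop are illustrative, not the paper's actual hypergraph-based implementation.

```python
# Toy stratum assignment over a rule dependency graph (illustrative only).
# Each edge (body_pred, head_pred, negated) says the head predicate
# depends on the body predicate, possibly through negation.
deps = [
    ("edge", "path", False),        # path :- edge
    ("path", "path", False),        # path :- path, edge  (recursive)
    ("path", "unreachable", True),  # unreachable :- node, not path
]
preds = {p for d in deps for p in d[:2]}
stratum = {p: 0 for p in preds}

# Relax until a fixpoint: a negated dependency forces the head one
# stratum higher than the body.  (A real stratifier must also reject
# negation inside a cycle, which this sketch omits.)
changed = True
while changed:
    changed = False
    for body, head, neg in deps:
        need = stratum[body] + (1 if neg else 0)
        if stratum[head] < need:
            stratum[head] = need
            changed = True

print(sorted(stratum.items()))  # [('edge', 0), ('path', 0), ('unreachable', 1)]
```

Here `unreachable` lands in a higher stratum than `path`, so all `path` facts are fully derived before the negation is evaluated.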
Related papers
- Fuzzy Datalog$^\exists$ over Arbitrary t-Norms [5.464669506214195]
One of the main challenges in the area of Neuro-Symbolic AI is to perform logical reasoning in the presence of both neural and symbolic data.
This requires combining heterogeneous data sources such as knowledge graphs, neural model predictions, structured databases, crowd-sourced data, and many more.
We generalise the standard rule-based language Datalog with existential rules to this setting by allowing arbitrary t-norms in place of classical conjunctions in rule bodies.
The resulting formalism allows us to perform reasoning about associated data with degrees of uncertainty while preserving computational complexity results and the applicability of reasoning techniques established for…
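A t-norm is a binary operation on [0, 1] that generalises conjunction to truth degrees. The hypothetical snippet below shows how a fuzzy rule body's degree could be computed by folding a t-norm over its atoms' degrees; the three standard t-norms shown are well known, but the function names and example degrees are illustrative only.

```python
from functools import reduce

# Illustrative sketch: in fuzzy Datalog, the classical conjunction in a
# rule body is replaced by a t-norm combining the atoms' truth degrees.
godel = min                                       # Gödel t-norm
product = lambda x, y: x * y                      # product t-norm
lukasiewicz = lambda x, y: max(0.0, x + y - 1.0)  # Łukasiewicz t-norm

def body_degree(tnorm, degrees):
    """Fold a t-norm over the truth degrees of the body atoms."""
    return reduce(tnorm, degrees, 1.0)  # 1.0 is the unit of every t-norm

degrees = [0.9, 0.8]  # hypothetical degrees for two body atoms
print(body_degree(godel, degrees))                   # 0.8
print(round(body_degree(product, degrees), 2))       # 0.72
print(round(body_degree(lukasiewicz, degrees), 2))   # 0.7
```

With classical (0/1) degrees all three t-norms coincide with ordinary conjunction, which is why the generalisation preserves the standard semantics as a special case.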
arXiv Detail & Related papers (2024-03-05T12:51:40Z)
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z)
- Enhancing Datalog Reasoning with Hypertree Decompositions [17.868595375154506]
We provide algorithms that exploit hypertree decompositions for the materialisation and incremental evaluation of Datalog programs.
We combine this approach with standard Datalog reasoning algorithms in a modular fashion so that the overhead caused by the decompositions is reduced.
arXiv Detail & Related papers (2023-05-11T14:51:16Z)
- Seminaive Materialisation in DatalogMTL [10.850687097496373]
DatalogMTL is an extension of Datalog with metric temporal operators.
We propose a materialisation-based procedure to minimise redundant computation.
Our experiments show that our optimised seminaive strategy for DatalogMTL is able to significantly reduce materialisation times.
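Seminaive evaluation avoids redundant computation by joining only the newly derived facts (the "delta") in each round, rather than re-deriving everything from scratch. The sketch below illustrates the plain seminaive idea on transitive closure; it is a generic illustration and omits the metric temporal operators that DatalogMTL adds.

```python
# Seminaive fixpoint for transitive closure (illustrative sketch):
# each round joins only the newly derived facts against the base
# relation, so known tuples are never re-derived.
edges = {("a", "b"), ("b", "c"), ("c", "d")}
path = set(edges)    # materialisation so far
delta = set(edges)   # facts derived in the previous round

while delta:
    # Join the delta with the base edges: path(x, z) :- path(x, y), edge(y, z).
    new = {(x, z) for (x, y) in delta for (y2, z) in edges if y == y2}
    delta = new - path  # keep only genuinely new facts
    path |= delta

print(len(path))  # 6: the three edges plus (a,c), (b,d), (a,d)
```

The loop terminates as soon as a round produces no new facts, which is exactly the fixpoint condition of naive evaluation but reached with far less repeated work.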
arXiv Detail & Related papers (2022-08-15T10:04:44Z)
- A Gaze into the Internal Logic of Graph Neural Networks, with Logic [0.0]
Graph Neural Networks share several key inference mechanisms with Logic Programming.
We show how to model the information flows involved in learning to infer properties of new nodes from a graph's link structure and the information content of its nodes.
Our approach consists in emulating, with the help of a Prolog program, the key information propagation steps of a Graph Neural Network's training and inference stages.
As a practical outcome, we obtain a logic program that, when seen as a machine learning algorithm, performs close to the state of the art on the node property prediction benchmark.
arXiv Detail & Related papers (2022-08-05T10:49:21Z)
- Complexity of Arithmetic in Warded Datalog+- [1.5469452301122173]
Warded Datalog+- extends the logic-based language Datalog with existential quantifiers in rule heads.
We define a new language that extends Warded Datalog+- with arithmetic and prove its P-completeness.
We present an efficient reasoning algorithm for our newly defined language and prove descriptive complexity for a recently introduced Datalog fragment with integer arithmetic.
arXiv Detail & Related papers (2022-02-10T15:14:03Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- DAGA: Data Augmentation with a Generation Approach for Low-resource Tagging Tasks [88.62288327934499]
We propose a novel augmentation method with language models trained on the linearized labeled sentences.
Our method is applicable to both supervised and semi-supervised settings.
arXiv Detail & Related papers (2020-11-03T07:49:15Z)
- An Integer Linear Programming Framework for Mining Constraints from Data [81.60135973848125]
We present a general framework for mining constraints from data.
In particular, we consider the inference in structured output prediction as an integer linear programming (ILP) problem.
We show that our approach can learn to solve 9x9 Sudoku puzzles and minimal spanning tree problems from examples without providing the underlying rules.
arXiv Detail & Related papers (2020-06-18T20:09:53Z)
- Evaluating Logical Generalization in Graph Neural Networks [59.70452462833374]
We study the task of logical generalization using graph neural networks (GNNs).
Our benchmark suite, GraphLog, requires that learning algorithms perform rule induction in different synthetic logics.
We find that the ability for models to generalize and adapt is strongly determined by the diversity of the logical rules they encounter during training.
arXiv Detail & Related papers (2020-03-14T05:45:55Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.