Lecture Notes on Verifying Graph Neural Networks
- URL: http://arxiv.org/abs/2510.11617v1
- Date: Mon, 13 Oct 2025 16:57:20 GMT
- Title: Lecture Notes on Verifying Graph Neural Networks
- Authors: François Schwarzentruber
- Abstract summary: We first recall the connection between graph neural networks and logics such as first-order logic and graded modal logic. We then present a modal logic in which counting modalities appear in linear inequalities in order to solve verification tasks on graph neural networks. We describe an algorithm for the satisfiability problem of that logic.
- Score: 10.812772606528172
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In these lecture notes, we first recall the connection between graph neural networks, Weisfeiler-Lehman tests, and logics such as first-order logic and graded modal logic. We then present a modal logic in which counting modalities appear in linear inequalities in order to solve verification tasks on graph neural networks. We describe an algorithm for the satisfiability problem of that logic. It is inspired by the tableau method for vanilla modal logic, extended with reasoning in the quantifier-free fragment of Boolean algebra with Presburger arithmetic.
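To picture the tableau style the notes build on, here is a minimal satisfiability checker for plain modal logic K, without the counting modalities or Presburger reasoning of the notes' logic; the tuple encoding of formulas is our own illustrative choice, not the paper's.

```python
# Formulas in negation normal form, encoded as tuples:
#   ("lit", "p")  positive atom      ("nlit", "p")  negated atom
#   ("and", f, g) / ("or", f, g)     ("box", f) / ("dia", f)

def sat(formulas):
    """Tableau satisfiability test for the basic modal logic K."""
    formulas = set(formulas)

    # Saturate the current world under the conjunction rule.
    changed = True
    while changed:
        changed = False
        for f in list(formulas):
            if f[0] == "and" and not {f[1], f[2]} <= formulas:
                formulas |= {f[1], f[2]}
                changed = True

    # Branch on a disjunction: the world is satisfiable iff one branch is.
    for f in formulas:
        if f[0] == "or":
            rest = formulas - {f}
            return sat(rest | {f[1]}) or sat(rest | {f[2]})

    # Clash: p and not-p in the same world.
    if any(f[0] == "lit" and ("nlit", f[1]) in formulas for f in formulas):
        return False

    # Each diamond spawns a successor world, which must also satisfy
    # the bodies of all boxes of the current world.
    boxes = {f[1] for f in formulas if f[0] == "box"}
    return all(sat(boxes | {f[1]}) for f in formulas if f[0] == "dia")
```

Per the abstract, the logic of the notes extends this scheme: diamonds become counting modalities constrained by linear inequalities, and the local reasoning steps invoke the quantifier-free fragment of Boolean algebra with Presburger arithmetic.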
Related papers
- Query Languages for Machine-Learning Models [7.343886246061387]
I discuss two logics for weighted finite structures.
I present illustrative examples of queries to neural networks that can be expressed in these logics.
arXiv Detail & Related papers (2026-01-14T11:15:09Z)
- From Neural Networks to Logical Theories: The Correspondence between Fibring Modal Logics and Fibring Neural Networks [17.474679381815026]
Fibring of modal logics is a well-established formalism for combining countable families of modal logics into a single fibred language.
Fibring of neural networks was introduced as a neurosymbolic framework for combining learning and reasoning in neural networks.
arXiv Detail & Related papers (2025-09-28T14:32:42Z)
- The Correspondence Between Bounded Graph Neural Networks and Fragments of First-Order Logic [8.430502131775723]
We propose GNN architectures that correspond precisely to prominent fragments of first-order logic (FO).
Our results provide a unifying framework for understanding the logical expressiveness of GNNs within FO.
arXiv Detail & Related papers (2025-05-12T19:45:45Z)
- Neural logic programs and neural nets [0.0]
We first define the answer set semantics of (boolean) neural nets and then introduce from first principles a class of neural logic programs and show that nets and programs are equivalent.
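The equivalence between boolean nets and logic programs can be illustrated on a single threshold neuron; the encoding below is a toy example of our own, not the paper's construction.

```python
def neuron(inputs, weights, threshold):
    # A boolean threshold neuron: fires iff the weighted sum of its
    # boolean inputs reaches the threshold.
    return sum(w * x for w, x in zip(weights, inputs)) >= threshold

def rule_and(a, b):
    # The logic rule  h :- a, b.  (h holds when both a and b hold.)
    return a and b

# Weights (1, 1) with threshold 2 compute exactly that rule.
assert all(neuron((a, b), (1, 1), 2) == bool(rule_and(a, b))
           for a in (0, 1) for b in (0, 1))
```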
arXiv Detail & Related papers (2024-06-13T19:22:04Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
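The continuous-relaxation step can be pictured with standard fuzzy connectives; the product t-norm shown here is one common choice for illustration, not necessarily LOGICSEG's exact grounding.

```python
# Product t-norm relaxation: truth values in [0, 1] replace booleans,
# so grounded formulae become differentiable in the network outputs.
def f_and(x, y): return x * y
def f_or(x, y):  return x + y - x * y
def f_not(x):    return 1.0 - x

# Relaxed truth of (p and not q) for soft predictions p = 0.9, q = 0.2:
truth = f_and(0.9, f_not(0.2))   # 0.9 * 0.8 = 0.72
```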
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Discourse-Aware Graph Networks for Textual Logical Reasoning [142.0097357999134]
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve logical reasoning QA and introduce discourse-aware graph networks (DAGNs).
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations by end-to-end evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
arXiv Detail & Related papers (2022-07-04T14:38:49Z)
- AdaLoGN: Adaptive Logic Graph Network for Reasoning-Based Machine Reading Comprehension [21.741085513119785]
Recent machine reading comprehension datasets such as ReClor and LogiQA require performing logical reasoning over text.
We present a neural-symbolic approach which, to predict an answer, passes messages over a graph representing logical relations between text units.
arXiv Detail & Related papers (2022-03-16T23:51:01Z)
- Neuro-Symbolic Inductive Logic Programming with Logical Neural Networks [65.23508422635862]
We propose learning rules with the recently proposed logical neural networks (LNN).
Compared to others, LNNs offer strong connection to classical Boolean logic.
Our experiments on standard benchmarking tasks confirm that LNN rules are highly interpretable.
arXiv Detail & Related papers (2021-12-06T19:38:30Z)
- Neural Logic Reasoning [47.622957656745356]
We propose Logic-Integrated Neural Network (LINN) to integrate the power of deep learning and logic reasoning.
LINN learns basic logical operations such as AND, OR, NOT as neural modules, and conducts propositional logical reasoning through the network for inference.
Experiments show that LINN significantly outperforms state-of-the-art recommendation models in Top-K recommendation.
arXiv Detail & Related papers (2020-08-20T14:53:23Z)
- Logical Neural Networks [51.46602187496816]
We propose a novel framework seamlessly providing key properties of both neural nets (learning) and symbolic logic (knowledge and reasoning).
Every neuron has a meaning as a component of a formula in a weighted real-valued logic, yielding a highly interpretable disentangled representation.
Inference is omnidirectional rather than focused on predefined target variables, and corresponds to logical reasoning.
arXiv Detail & Related papers (2020-06-23T16:55:45Z)
- Evaluating Logical Generalization in Graph Neural Networks [59.70452462833374]
We study the task of logical generalization using graph neural networks (GNNs).
Our benchmark suite, GraphLog, requires that learning algorithms perform rule induction in different synthetic logics.
We find that the ability for models to generalize and adapt is strongly determined by the diversity of the logical rules they encounter during training.
arXiv Detail & Related papers (2020-03-14T05:45:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.