Boolean-aware Boolean Circuit Classification: A Comprehensive Study on Graph Neural Network
- URL: http://arxiv.org/abs/2411.10481v1
- Date: Wed, 13 Nov 2024 08:38:21 GMT
- Title: Boolean-aware Boolean Circuit Classification: A Comprehensive Study on Graph Neural Network
- Authors: Liwei Ni, Xinquan Li, Biwei Xie, Huawei Li
- Abstract summary: The graph structure-based Boolean circuit classification can be framed as a graph classification task.
We first define the proposed matching-equivalent class based on its "Boolean-aware" property.
We present a common study framework based on graph neural networks (GNNs) to analyze the key factors that can affect Boolean-aware circuit classification.
- Score: 2.1080766959962625
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A Boolean circuit is a computational graph that consists of a dynamic directed graph structure and static functionality. The commonly used logic optimization and Boolean matching-based transformations can change the behavior of a Boolean circuit in terms of both its graph structure and its functionality in logic synthesis. Graph structure-based Boolean circuit classification can be framed as a graph classification task; however, functionality-based Boolean circuit classification remains an open problem for further research. In this paper, we first define the proposed matching-equivalent class based on its "Boolean-aware" property. The Boolean circuits in the proposed class can be transformed into each other. Then, we present a common study framework based on graph neural networks (GNNs) to analyze the key factors that can affect Boolean-aware Boolean circuit classification. The empirical experimental results verify the proposed analysis and also show directions and opportunities for improving on the proposed problem. The code and dataset will be released after acceptance.
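The distinction between graph structure and functionality can be made concrete with a small sketch. Below, a circuit is encoded as a hypothetical gate dictionary in topological order (an illustration only, not the paper's representation), and its complete truth table serves as a canonical functionality signature: two structurally different circuits that compute the same function share one signature.

```python
from itertools import product

def eval_circuit(gates, inputs, outputs, assignment):
    """Evaluate the circuit for one input assignment.

    `gates` maps a gate name to (op, operand_names) and must be listed in
    topological order (dicts preserve insertion order in Python 3.7+).
    """
    values = dict(assignment)
    for name, (op, args) in gates.items():
        a = [values[x] for x in args]
        if op == "AND":
            values[name] = a[0] & a[1]
        elif op == "OR":
            values[name] = a[0] | a[1]
        elif op == "NOT":
            values[name] = 1 - a[0]
    return tuple(values[o] for o in outputs)

def truth_table_signature(gates, inputs, outputs):
    """Canonical functionality signature: outputs over all 2^n assignments."""
    return tuple(
        eval_circuit(gates, inputs, outputs, dict(zip(inputs, bits)))
        for bits in product([0, 1], repeat=len(inputs))
    )

# Two structurally different circuits computing the same function a AND b:
c1 = {"g0": ("AND", ["a", "b"])}
c2 = {"n0": ("NOT", ["a"]), "n1": ("NOT", ["b"]),
      "o":  ("OR", ["n0", "n1"]), "g0": ("NOT", ["o"])}  # De Morgan form
sig1 = truth_table_signature(c1, ["a", "b"], ["g0"])
sig2 = truth_table_signature(c2, ["a", "b"], ["g0"])
assert sig1 == sig2  # same functionality despite different graph structure
```

Exhaustive truth tables only scale to small input counts, which is one reason functionality-based classification of large circuits is a hard open problem.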
Related papers
- Boolean Product Graph Neural Networks [8.392545965667288]
Graph Neural Networks (GNNs) have recently achieved significant success, with a key operation involving the aggregation of information from neighboring nodes.
This paper proposes a novel Boolean product-based graph residual connection in GNNs to link the latent graph and the original graph.
We validate the proposed method in benchmark datasets and demonstrate its ability to enhance the performance and robustness of GNNs.
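For background, the Boolean product of adjacency matrices is the OR-of-ANDs analogue of the ordinary matrix product; a minimal stdlib sketch (illustrative only, not the paper's residual-connection implementation):

```python
def boolean_product(A, B):
    """Boolean matrix product: (A o B)[i][j] = OR_k (A[i][k] AND B[k][j])."""
    n = len(A)
    return [[int(any(A[i][k] and B[k][j] for k in range(n)))
             for j in range(n)] for i in range(n)]

# Adjacency of a 3-node path graph 0 - 1 - 2 (undirected, no self-loops):
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
A2 = boolean_product(A, A)  # indicates reachability in exactly two hops
```

Unlike the real-valued product, entries stay binary, so the result can again be read as a graph adjacency matrix.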
arXiv Detail & Related papers (2024-09-21T03:31:33Z)
- Boolean Logic as an Error feedback mechanism [0.5439020425819]
The notion of Boolean logic backpropagation was introduced to build neural networks with weights and activations being Boolean numbers.
Most computations can be done with logic instead of real arithmetic during the training and inference phases.
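As a rough illustration of computing with logic instead of real arithmetic, here is a generic binarized-neuron sketch in the XNOR-count style (an assumption for illustration, not the paper's Boolean backpropagation rule):

```python
def xnor_neuron(x_bits, w_bits):
    """Binarized neuron: XNOR agreement count, then majority-vote activation.

    With bits in {0, 1}, XNOR(x, w) is 1 exactly when x == w; the neuron
    fires when more than half of the input/weight pairs agree.
    """
    agreements = sum(1 for x, w in zip(x_bits, w_bits) if x == w)
    return int(agreements * 2 > len(x_bits))

x = [1, 0, 1, 1, 0]
w = [1, 1, 1, 0, 0]
y = xnor_neuron(x, w)  # 3 of 5 positions agree -> fires (1)
```

The multiply-accumulate of a real-valued neuron collapses into bit comparisons and a popcount, which is what makes purely logical hardware implementations attractive.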
arXiv Detail & Related papers (2024-01-29T18:56:21Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
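One common way such fuzzy-logic grounding works is to replace Boolean connectives with differentiable ones; the sketch below uses the product t-norm and its dual co-norm, which is one standard choice (an assumption for illustration, not necessarily the relaxation LOGICSEG uses):

```python
def f_and(a, b):
    """Product t-norm: fuzzy AND over scores in [0, 1]."""
    return a * b

def f_or(a, b):
    """Probabilistic sum: the co-norm dual to the product t-norm."""
    return a + b - a * b

def f_not(a):
    return 1.0 - a

# Relax the rule "cat -> animal", i.e. (NOT cat) OR animal, over the
# soft class scores a network might output (hypothetical values):
p_cat, p_animal = 0.9, 0.95
rule_satisfaction = f_or(f_not(p_cat), p_animal)
loss = 1.0 - rule_satisfaction  # penalize rule violations during training
```

Because every connective is smooth in the scores, such a rule-satisfaction term can be added to a network's loss and trained by ordinary backpropagation.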
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Learning to Reason with Neural Networks: Generalization, Unseen Data and Boolean Measures [44.87247707099189]
This paper considers the Pointer Value Retrieval (PVR) benchmark introduced in [ZRKB21], where a 'reasoning' function acts on a string of digits to produce the label.
It is first shown that in order to learn logical functions with gradient descent on symmetric neural networks, the generalization error can be lower-bounded in terms of the noise-stability of the target function.
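Noise stability, the quantity appearing in that lower bound, can be computed exactly for tiny Boolean functions. A sketch in the standard ±1 convention, where Stab_rho(f) = E[f(x) f(y)] and y is obtained from x by flipping each coordinate independently with probability (1 - rho)/2:

```python
from itertools import product

def noise_stability(f, n, rho):
    """Exact Stab_rho(f) by enumerating all inputs and flip patterns.

    Only viable for small n: the double loop is 2^n * 2^n terms.
    """
    p_flip = (1 - rho) / 2
    total = 0.0
    for x in product([-1, 1], repeat=n):
        for flips in product([0, 1], repeat=n):
            prob = 1.0
            for b in flips:
                prob *= p_flip if b else (1 - p_flip)
            y = tuple(-v if b else v for v, b in zip(x, flips))
            total += prob * f(x) * f(y)
    return total / 2 ** n  # average over uniform x

maj3 = lambda x: 1 if sum(x) > 0 else -1
stab = noise_stability(maj3, 3, 0.8)
```

As a sanity check, the Fourier expansion Maj3(x) = (x1 + x2 + x3 - x1*x2*x3)/2 gives the closed form Stab_rho(Maj3) = (3*rho + rho**3)/4, which the enumeration reproduces.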
arXiv Detail & Related papers (2022-05-26T21:53:47Z)
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
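As a toy instance of a kernel function that computes an inner product on graphs, here is the vertex-label histogram kernel: deliberately much simpler than the kernels such architectures target, but still a valid positive semi-definite kernel on labeled graphs.

```python
from collections import Counter

def vertex_histogram_kernel(labels_g1, labels_g2):
    """Inner product of the two graphs' node-label count vectors."""
    h1, h2 = Counter(labels_g1), Counter(labels_g2)
    return sum(h1[label] * h2[label] for label in h1)

# Node labels of two small molecule-like graphs (hypothetical data):
g1 = ["C", "C", "O", "H", "H"]
g2 = ["C", "O", "O", "H"]
k = vertex_histogram_kernel(g1, g2)  # 2*1 + 1*2 + 2*1 = 6
```

Note that the kernel value is computed directly from the graphs' labels, with no intermediate graph embedding, which is the structural property the paper exploits.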
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Reverse Derivative Ascent: A Categorical Approach to Learning Boolean Circuits [0.0]
We introduce Reverse Derivative Ascent: a categorical analogue of gradient based methods for machine learning.
Our motivating example is boolean circuits: we show how our algorithm can be applied to such circuits by using the theory of reverse differential categories.
We demonstrate its empirical value by giving experimental results on benchmark machine learning datasets.
arXiv Detail & Related papers (2021-01-26T00:07:20Z)
- Building powerful and equivariant graph neural networks with structural message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
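The first idea, propagating a one-hot encoding of the nodes, can be sketched with a plain adjacency matrix; sum aggregation keeps each step permutation-equivariant. This is an illustrative sketch of the propagation alone, not the paper's parametrized message and update functions:

```python
def propagate_one_hot(adj, steps=1):
    """Propagate one-hot node identifiers along edges with sum aggregation.

    Starting from the identity matrix (node i carries one-hot vector e_i),
    after k steps row i counts walks of length k from node i to every
    node -- a local structural context around each node.
    """
    n = len(adj)
    context = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for _ in range(steps):
        context = [
            [sum(adj[i][k] * context[k][j] for k in range(n))
             for j in range(n)]
            for i in range(n)
        ]
    return context

# Path graph 0 - 1 - 2:
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
ctx = propagate_one_hot(adj, steps=2)  # walk counts of length 2
```

Relabeling the nodes permutes the rows and columns of the context matrix consistently, which is exactly the equivariance property the paper's parametrization must preserve.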
arXiv Detail & Related papers (2020-06-26T17:15:16Z)
- LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network [111.24773949467567]
We propose LogicalFactChecker, a neural network approach capable of leveraging logical operations for fact checking.
It achieves the state-of-the-art performance on TABFACT, a large-scale, benchmark dataset.
arXiv Detail & Related papers (2020-04-28T17:04:19Z)
- Evaluating Logical Generalization in Graph Neural Networks [59.70452462833374]
We study the task of logical generalization using graph neural networks (GNNs).
Our benchmark suite, GraphLog, requires that learning algorithms perform rule induction in different synthetic logics.
We find that the ability for models to generalize and adapt is strongly determined by the diversity of the logical rules they encounter during training.
arXiv Detail & Related papers (2020-03-14T05:45:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.