Symbolic Synthesis of Neural Networks
- URL: http://arxiv.org/abs/2303.03340v1
- Date: Mon, 6 Mar 2023 18:13:14 GMT
- Title: Symbolic Synthesis of Neural Networks
- Authors: Eli Whitehouse
- Abstract summary: I present Graph-based Symbolically Synthesized Neural Networks (G-SSNNs).
G-SSNNs are a form of neural network whose topology and parameters are informed by the output of a symbolic program.
I demonstrate that by developing symbolic abstractions at a population level, I can elicit reliable patterns of improved generalization with small quantities of data known to contain local and discrete features.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks adapt very well to distributed and continuous
representations, but struggle to learn and generalize from small amounts of
data. Symbolic systems commonly achieve data efficient generalization by
exploiting modularity to benefit from local and discrete features of a
representation. These features allow symbolic programs to be improved one
module at a time and to experience combinatorial growth in the values they can
successfully process. However, it is difficult to design components that can be
used to form symbolic abstractions and which are highly overparametrized like
neural networks, as the adjustment of parameters makes the semantics of modules
unstable. I present Graph-based Symbolically Synthesized Neural Networks
(G-SSNNs), a form of neural network whose topology and parameters are informed
by the output of a symbolic program. I demonstrate that by developing symbolic
abstractions at a population level, and applying gradient-based optimization to
such neural models at an individual level, I can elicit reliable patterns of
improved generalization with small quantities of data known to contain local
and discrete features. The paradigm embodied by G-SSNNs offers a route towards
the communal development of compact and composable abstractions which can be
flexibly repurposed for a variety of tasks and high-dimensional media. In
future work, I hope to pursue these benefits by exploring more ambitious G-SSNN
designs based on more complex classes of symbolic programs. The code and data
associated with the reported results are publicly available at
https://github.com/shlomenu/symbolically_synthesized_networks .
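For intuition, here is a minimal, hypothetical PyTorch sketch of the paradigm: a symbolic program (stubbed out below) emits a graph, and that graph fixes which learned edge modules the network applies; a population of such networks arises from a population of symbolic outputs. The encoding, module shapes, and names are illustrative assumptions, not the design in the repository above.

```python
import torch
import torch.nn as nn

def symbolic_program(seed: int, n_nodes: int = 8):
    """Hypothetical stand-in for a symbolic program: deterministically
    emits a graph (edge list) that will fix the network's topology."""
    g = torch.Generator().manual_seed(seed)
    edges = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)
             if torch.rand(1, generator=g).item() < 0.3]
    return edges or [(0, 1)]  # guarantee at least one edge

class GSSNN(nn.Module):
    """Toy G-SSNN: the symbolic program's output graph decides which
    pairs of input blocks are combined by learned edge modules."""
    def __init__(self, edges, n_nodes=8, d=16, n_classes=2):
        super().__init__()
        self.edges = edges
        self.embed = nn.Linear(1, d)                      # one feature block per node
        self.edge_mlps = nn.ModuleList(nn.Linear(2 * d, d) for _ in edges)
        self.readout = nn.Linear(d, n_classes)

    def forward(self, x):                                 # x: (batch, n_nodes)
        h = self.embed(x.unsqueeze(-1))                   # (batch, n_nodes, d)
        msgs = [mlp(torch.cat([h[:, i], h[:, j]], dim=-1))
                for (i, j), mlp in zip(self.edges, self.edge_mlps)]
        return self.readout(torch.stack(msgs).mean(0))

# A "population" of G-SSNNs: one network per symbolic program output,
# each of which is then trained individually by gradient descent.
population = [GSSNN(symbolic_program(seed)) for seed in range(5)]
print(population[0](torch.randn(4, 8)).shape)             # torch.Size([4, 2])
```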
Related papers
- Steinmetz Neural Networks for Complex-Valued Data [23.80312814400945]
We introduce a new approach to processing complex-valued data using DNNs consisting of parallel real-valued subnetworks with coupled outputs.
Our proposed class of architectures, referred to as Steinmetz Neural Networks, leverages multi-view learning to construct more interpretable representations within the latent space.
Our numerical experiments depict the improved performance and robustness to additive noise afforded by these networks on benchmark datasets and synthetic examples.
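As a rough sketch of the coupled parallel-subnetwork idea (a plain feedforward illustration under assumed shapes, not the authors' architecture):

```python
import torch
import torch.nn as nn

class SteinmetzStyleNet(nn.Module):
    """Illustrative sketch: two parallel real-valued subnetworks process
    the real and imaginary parts of a complex input, and their outputs
    are coupled by a final shared layer."""
    def __init__(self, d_in=32, d_hidden=64, d_out=10):
        super().__init__()
        self.real_net = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU())
        self.imag_net = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU())
        self.coupling = nn.Linear(2 * d_hidden, d_out)    # couples the two views

    def forward(self, z):                                 # z: complex (batch, d_in)
        hr, hi = self.real_net(z.real), self.imag_net(z.imag)
        return self.coupling(torch.cat([hr, hi], dim=-1))

z = torch.randn(4, 32, dtype=torch.cfloat)
print(SteinmetzStyleNet()(z).shape)                       # torch.Size([4, 10])
```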
arXiv Detail & Related papers (2024-09-16T08:26:06Z)
- The Role of Foundation Models in Neuro-Symbolic Learning and Reasoning [54.56905063752427]
Neuro-Symbolic AI (NeSy) holds promise to ensure the safe deployment of AI systems.
Existing pipelines that train the neural and symbolic components sequentially require extensive labelling.
A new architecture, NeSyGPT, fine-tunes a vision-language foundation model to extract symbolic features from raw data.
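NeSyGPT itself fine-tunes a foundation model; purely as a loose, zero-shot stand-in for the step of extracting symbolic features from raw data with a vision-language model, one could score an image against a symbol vocabulary using CLIP via the Hugging Face transformers API (the symbol names and file path below are assumptions, and this is not the paper's pipeline):

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Loose illustration only: zero-shot symbol extraction with CLIP.
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

symbols = ["circle", "square", "triangle"]                # assumed symbol vocabulary
image = Image.open("scene.png")                           # hypothetical input image

inputs = processor(text=[f"a photo of a {s}" for s in symbols],
                   images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    probs = model(**inputs).logits_per_image.softmax(dim=-1)
print(symbols[probs.argmax().item()])                     # extracted symbol
```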
arXiv Detail & Related papers (2024-02-02T20:33:14Z)
- BLIS-Net: Classifying and Analyzing Signals on Graphs [20.345611294709244]
Graph neural networks (GNNs) have emerged as a powerful tool for tasks such as node classification and graph classification.
We introduce the BLIS-Net (Bi-Lipschitz Scattering Net), a novel GNN that builds on the previously introduced geometric scattering transform.
We show that BLIS-Net achieves superior performance on both synthetic and real-world data sets based on traffic flow and fMRI data.
arXiv Detail & Related papers (2023-10-26T17:03:14Z)
- Heterogenous Memory Augmented Neural Networks [84.29338268789684]
We introduce a novel heterogeneous memory augmentation approach for neural networks.
By introducing learnable memory tokens with an attention mechanism, we can effectively boost performance without huge computational overhead.
We demonstrate our approach on various image and graph-based tasks under both in-distribution (ID) and out-of-distribution (OOD) conditions.
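A generic sketch of learnable memory tokens read through attention (the block structure and sizes are assumptions; the paper's heterogeneous design is more elaborate):

```python
import torch
import torch.nn as nn

class MemoryAugmentedBlock(nn.Module):
    """Generic sketch: a set of learnable memory tokens that the
    input sequence reads from via multi-head attention."""
    def __init__(self, d=64, n_mem=16, n_heads=4):
        super().__init__()
        self.memory = nn.Parameter(torch.randn(n_mem, d) * 0.02)
        self.attn = nn.MultiheadAttention(d, n_heads, batch_first=True)

    def forward(self, x):                                 # x: (batch, seq, d)
        mem = self.memory.expand(x.size(0), -1, -1)       # share memory across batch
        out, _ = self.attn(query=x, key=mem, value=mem)   # read from memory
        return x + out                                    # residual update

x = torch.randn(2, 10, 64)
print(MemoryAugmentedBlock()(x).shape)                    # torch.Size([2, 10, 64])
```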
arXiv Detail & Related papers (2023-10-17T01:05:28Z)
- Equivariant Matrix Function Neural Networks [1.8717045355288808]
We introduce Matrix Function Neural Networks (MFNs), a novel architecture that parameterizes non-local interactions through analytic matrix equivariant functions.
MFNs are able to capture intricate non-local interactions in quantum systems, paving the way to new state-of-the-art force fields.
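The flavor of the approach can be sketched with a single analytic matrix function, here a matrix exponential of a scaled coupling built from the graph (a toy layer that ignores the equivariance machinery of the actual MFN architecture):

```python
import torch
import torch.nn as nn

class MatrixFunctionLayer(nn.Module):
    """Illustrative sketch (not the MFN architecture itself): features are
    mixed non-locally by an analytic matrix function of a symmetric
    coupling matrix, here the matrix exponential."""
    def __init__(self, d=16):
        super().__init__()
        self.mix = nn.Linear(d, d)
        self.beta = nn.Parameter(torch.tensor(0.5))       # learned coupling scale

    def forward(self, x, adj):                            # x: (n, d), adj: (n, n)
        h = torch.linalg.matrix_exp(self.beta * adj)      # analytic matrix function
        return self.mix(h @ x)                            # non-local feature mixing

n = 6
adj = (torch.rand(n, n) < 0.4).float()
adj = (adj + adj.T).clamp(max=1)                          # symmetrize
print(MatrixFunctionLayer()(torch.randn(n, 16), adj).shape)   # torch.Size([6, 16])
```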
arXiv Detail & Related papers (2023-10-16T14:17:00Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
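The paper's criterion comes from its convergence analysis; purely as an illustration of trimming a search space by filtering connectivity patterns with a cheap proxy score (the proxy and threshold below are assumptions, not the paper's metric):

```python
import itertools
import numpy as np

def effective_paths(adj: np.ndarray) -> float:
    """Assumed proxy score: count paths from the input node (0) to the
    output node (n-1) through the DAG's adjacency matrix."""
    n = adj.shape[0]
    reach = np.linalg.matrix_power(np.eye(n) + adj, n - 1)
    return reach[0, -1]

n = 5  # candidate architectures: all DAG connectivity patterns on 5 nodes
candidates = []
upper = [(i, j) for i in range(n) for j in range(i + 1, n)]
for bits in itertools.product([0, 1], repeat=len(upper)):
    adj = np.zeros((n, n))
    for b, (i, j) in zip(bits, upper):
        adj[i, j] = b
    candidates.append(adj)

# Filter out "unpromising" patterns before any training is run.
kept = [a for a in candidates if effective_paths(a) >= 4]
print(f"{len(kept)} of {len(candidates)} patterns survive the filtration")
```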
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Logsig-RNN: a novel network for robust and efficient skeleton-based action recognition [3.775860173040509]
We propose a novel module, namely Logsig-RNN, which is the combination of the log-signature layer and recurrent type neural networks (RNNs).
In particular, we achieve the state-of-the-art accuracy on Chalearn2013 gesture data by combining simple path transformation layers with the Logsig-RNN.
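A simplified stand-in for the module (it uses only depth-1 path increments in place of true log-signatures, which include higher-depth terms; window size and dimensions are assumptions):

```python
import torch
import torch.nn as nn

class IncrementRNN(nn.Module):
    """Simplified stand-in for the Logsig-RNN idea: summarize each window
    of the input path by a cheap path feature (here just the depth-1
    signature term, i.e. the increment), then run an RNN over the
    window summaries."""
    def __init__(self, d_in=3, d_hidden=32, window=10):
        super().__init__()
        self.window = window
        self.rnn = nn.LSTM(d_in, d_hidden, batch_first=True)

    def forward(self, path):                              # path: (batch, T, d_in)
        b, t, d = path.shape
        t = (t // self.window) * self.window
        w = path[:, :t].reshape(b, -1, self.window, d)
        feats = w[:, :, -1] - w[:, :, 0]                  # increment over each window
        out, _ = self.rnn(feats)                          # RNN over coarse summaries
        return out[:, -1]

skeleton = torch.randn(4, 100, 3)                         # e.g. joint coordinates over time
print(IncrementRNN()(skeleton).shape)                     # torch.Size([4, 32])
```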
arXiv Detail & Related papers (2021-10-25T14:47:15Z)
- Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
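One standard way to realize such binarization is sign quantization trained with a straight-through estimator; the sketch below assumes that scheme, which the summary above does not spell out:

```python
import torch
import torch.nn as nn

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator: forward uses
    {-1, +1} weights, backward passes gradients through for weights
    inside [-1, 1]."""
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        return grad_out * (w.abs() <= 1).float()          # clip gradients outside [-1, 1]

class BinaryGraphConv(nn.Module):
    """Toy binarized graph convolution: aggregate neighbors, then apply
    a weight matrix binarized in the forward pass."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(d_in, d_out) * 0.1)

    def forward(self, x, adj):                            # x: (n, d_in), adj: (n, n)
        w_bin = BinarizeSTE.apply(self.weight)            # {-1, +1} weights
        return adj @ x @ w_bin

x, adj = torch.randn(5, 8), torch.eye(5)
print(BinaryGraphConv(8, 4)(x, adj).shape)                # torch.Size([5, 4])
```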
arXiv Detail & Related papers (2020-12-31T18:48:58Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
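A sketch of learnable diffusion scales on a lazy random walk, where a learned soft selection over powers of the diffusion operator stands in for LEGS's actual wavelet construction (not the authors' code):

```python
import torch
import torch.nn as nn

class LearnableScattering(nn.Module):
    """Illustrative sketch: diffuse node features with powers of a lazy
    random-walk matrix P, then take a learnable soft selection over the
    diffusion scales, followed by an absolute-value nonlinearity."""
    def __init__(self, max_scale=8):
        super().__init__()
        self.select = nn.Parameter(torch.ones(max_scale))  # learnable scale weights

    def forward(self, x, adj):                            # x: (n, d), adj: (n, n)
        deg = adj.sum(1, keepdim=True).clamp(min=1)
        p = 0.5 * (torch.eye(len(adj)) + adj / deg)       # lazy random walk matrix
        diffusions, h = [], x
        for _ in range(len(self.select)):
            h = p @ h                                     # one more diffusion step
            diffusions.append(h)
        stack = torch.stack(diffusions)                   # (scales, n, d)
        weights = torch.softmax(self.select, 0).view(-1, 1, 1)
        return torch.abs((weights * stack).sum(0))        # scattering-style response

adj = (torch.rand(6, 6) < 0.5).float()
print(LearnableScattering()(torch.randn(6, 4), adj).shape)    # torch.Size([6, 4])
```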
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented here and is not responsible for any consequences arising from its use.