Operator Feature Neural Network for Symbolic Regression
- URL: http://arxiv.org/abs/2408.07719v1
- Date: Wed, 14 Aug 2024 09:47:13 GMT
- Title: Operator Feature Neural Network for Symbolic Regression
- Authors: Yusong Deng, Min Wu, Lina Yu, Jingyi Liu, Shu Wei, Yanjie Li, Weijun Li,
- Abstract summary: This paper introduces the operator feature neural network (OF-Net) which employs operator representation for expressions.
By substituting operator features for numeric loss, we can predict the combination of operators of target expressions.
We evaluate the model on public datasets, and the results demonstrate that the model achieves superior recovery rates and high $R^2$ scores.
- Score: 11.341249704023687
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Symbolic regression is a task aimed at identifying patterns in data and representing them through mathematical expressions, generally involving skeleton prediction and constant optimization. Many methods have achieved some success; however, they treat variables and symbols merely as characters of natural language without considering their mathematical essence. This paper introduces the operator feature neural network (OF-Net), which employs operator representation for expressions and proposes an implicit feature encoding method for the intrinsic mathematical operational logic of operators. By substituting operator features for numeric loss, we can predict the combination of operators of target expressions. We evaluate the model on public datasets, and the results demonstrate that the model achieves superior recovery rates and high $R^2$ scores. In the discussion of the results, we analyze the merits and demerits of OF-Net and propose optimization schemes.
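The abstract's two-stage pipeline (predict an expression skeleton, then optimize its constants against the data) can be illustrated with a minimal sketch. The skeleton `c0 * sin(x) + c1`, the data, and the closed-form fit below are hypothetical illustrations, not part of OF-Net itself:

```python
import math

# Toy sketch of the symbolic-regression pipeline the abstract describes:
# stage 1 produces an expression skeleton (here assumed to be
# c0 * sin(x) + c1), stage 2 fits the constants c0, c1 to data.

def skeleton(x, c0, c1):
    # hypothetical predicted skeleton with unknown constants
    return c0 * math.sin(x) + c1

def fit_constants(xs, ys):
    # Least-squares fit of (c0, c1) via the normal equations; a closed
    # form exists because the skeleton is linear in its constants.
    n = len(xs)
    s = [math.sin(x) for x in xs]
    ss = sum(v * v for v in s)
    sy = sum(v * y for v, y in zip(s, ys))
    sm = sum(s)
    ym = sum(ys)
    det = ss * n - sm * sm
    c0 = (n * sy - sm * ym) / det
    c1 = (ss * ym - sm * sy) / det
    return c0, c1

# Noise-free samples from a ground-truth expression 2*sin(x) + 0.5.
xs = [i * 0.1 for i in range(50)]
ys = [2.0 * math.sin(x) + 0.5 for x in xs]
c0, c1 = fit_constants(xs, ys)

# R^2 score of the fitted expression, the metric the abstract reports.
mean_y = sum(ys) / len(ys)
ss_res = sum((y - skeleton(x, c0, c1)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - mean_y) ** 2 for y in ys)
r2 = 1.0 - ss_res / ss_tot
```

With exact data the fit recovers the true constants and yields an $R^2$ of 1; in practice skeletons are nonlinear in their constants and require iterative optimizers.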
Related papers
- Parsing the Language of Expression: Enhancing Symbolic Regression with Domain-Aware Symbolic Priors [4.904996012808334]
We present an advanced symbolic regression method that integrates symbol priors from diverse scientific domains.
We propose novel tree-structured recurrent neural networks (RNNs) that leverage these symbol priors.
Experimental results demonstrate that leveraging symbol priors significantly enhances the performance of symbolic regression.
arXiv Detail & Related papers (2025-03-12T17:57:48Z) - SNIP: Bridging Mathematical Symbolic and Numeric Realms with Unified Pre-training [17.623227360825258]
We introduce SNIP, a Symbolic-Numeric Integrated Pre-training model.
By performing latent space analysis, we observe that SNIP provides cross-domain insights into the representations.
Results show that SNIP effectively transfers to various tasks, consistently outperforming fully supervised baselines.
arXiv Detail & Related papers (2023-10-03T17:32:44Z) - PROSE: Predicting Operators and Symbolic Expressions using Multimodal Transformers [5.263113622394007]
We develop a new neural network framework for predicting differential equations.
By using a transformer structure and a feature fusion approach, our network can simultaneously embed sets of solution operators for various parametric differential equations.
The network is shown to be able to handle noise in the data and errors in the symbolic representation, including noisy numerical values, model misspecification, and erroneous addition or deletion of terms.
arXiv Detail & Related papers (2023-09-28T19:46:07Z) - Discrete Morphological Neural Networks [0.0]
We propose the Discrete Morphological Neural Networks (DMNN) for binary image analysis to represent W-operators.
As a proof-of-concept, we apply the DMNN to recognize the boundary of digits with noise.
arXiv Detail & Related papers (2023-09-01T17:04:48Z) - Neural-Symbolic Recursive Machine for Systematic Generalization [113.22455566135757]
We introduce the Neural-Symbolic Recursive Machine (NSR), whose core is a Grounded Symbol System (GSS).
NSR integrates neural perception, syntactic parsing, and semantic reasoning.
We evaluate NSR's efficacy across four challenging benchmarks designed to probe systematic generalization capabilities.
arXiv Detail & Related papers (2022-10-04T13:27:38Z) - Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z) - Syntax-Aware Network for Handwritten Mathematical Expression Recognition [53.130826547287626]
Handwritten mathematical expression recognition (HMER) is a challenging task that has many potential applications.
Recent methods for HMER have achieved outstanding performance with an encoder-decoder architecture.
We propose a simple and efficient method for HMER, which is the first to incorporate syntax information into an encoder-decoder network.
arXiv Detail & Related papers (2022-03-03T09:57:19Z) - SymbolicGPT: A Generative Transformer Model for Symbolic Regression [3.685455441300801]
We present SymbolicGPT, a novel transformer-based language model for symbolic regression.
We show that our model performs strongly compared to competing models with respect to the accuracy, running time, and data efficiency.
arXiv Detail & Related papers (2021-06-27T03:26:35Z) - Discrete representations in neural models of spoken language [56.29049879393466]
We compare the merits of four commonly used metrics in the context of weakly supervised models of spoken language.
We find that the different evaluation metrics can give inconsistent results.
arXiv Detail & Related papers (2021-05-12T11:02:02Z) - Learning Symbolic Expressions: Mixed-Integer Formulations, Cuts, and Heuristics [1.1602089225841632]
We consider the problem of learning a regression function without assuming its functional form.
We propose a method that builds an expression tree by solving a restricted mixed-integer program.
arXiv Detail & Related papers (2021-02-16T18:39:14Z) - Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z) - Probing Linguistic Features of Sentence-Level Representations in Neural Relation Extraction [80.38130122127882]
We introduce 14 probing tasks targeting linguistic properties relevant to neural relation extraction (RE).
We use them to study representations learned by more than 40 different combinations of encoder architectures and linguistic features, trained on two datasets.
We find that the bias induced by the architecture and the inclusion of linguistic features are clearly expressed in the probing task performance.
arXiv Detail & Related papers (2020-04-17T09:17:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.