Transformation-Interaction-Rational Representation for Symbolic
Regression
- URL: http://arxiv.org/abs/2205.06807v1
- Date: Mon, 25 Apr 2022 16:53:43 GMT
- Title: Transformation-Interaction-Rational Representation for Symbolic
Regression
- Authors: Fabricio Olivetti de Franca
- Abstract summary: Symbolic Regression searches for a function form that approximates a dataset, often using Genetic Programming.
Since the search space is unrestricted, the returned models can be hard to understand; a novel representation called Interaction-Transformation was recently proposed to alleviate this problem.
We propose an extension of this representation that defines a new function form as the ratio of two Interaction-Transformation functions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Symbolic Regression searches for a function form that approximates a dataset,
often using Genetic Programming.
Since there is usually no restriction on what form the function can have,
Genetic Programming may return a hard-to-understand model due to non-linear
function chaining or long expressions. A novel representation called
Interaction-Transformation was recently proposed to alleviate this problem. In
this representation, the function form is restricted to an affine combination
of terms generated as the application of a single univariate function to the
interaction of selected variables. This representation obtained competitive
solutions on standard benchmarks. Despite the initial success, a broader set of
benchmarking functions revealed the limitations of the constrained
representation.
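
For concreteness, the Interaction-Transformation form described above can be written as the expression below (a sketch based on this abstract's description; the number of terms $m$, the integer exponents $k_{ij}$, and the per-term univariate functions $g_i$ are the quantities chosen during the search):

f_{IT}(\mathbf{x}) = w_0 + \sum_{i=1}^{m} w_i \, g_i\!\left(\prod_{j=1}^{d} x_j^{k_{ij}}\right),

where $d$ is the number of variables and $w_0, \dots, w_m$ are the coefficients of the affine combination.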
In this paper we propose an extension of this representation, called the
Transformation-Interaction-Rational representation, which defines a new function
form as the ratio of two Interaction-Transformation functions. Additionally,
the target variable can also be transformed with a univariate function. The
main goal is to improve the approximation power while still constraining the
overall complexity of the expression.
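
Under that reading, a plausible formalization of the Transformation-Interaction-Rational form is the ratio of two such Interaction-Transformation expressions $p$ and $q$, composed with an invertible transformation $g$ of the target (the $1 + q(\mathbf{x})$ normalization of the denominator and the notation are assumptions inferred from this abstract, not quoted from the paper):

g(y) \approx \frac{p(\mathbf{x})}{1 + q(\mathbf{x})}
\qquad\Longleftrightarrow\qquad
\hat{y} = g^{-1}\!\left(\frac{p(\mathbf{x})}{1 + q(\mathbf{x})}\right).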
We tested this representation with a standard Genetic Programming algorithm with
crossover and mutation. The results show a great improvement compared to its
predecessor and state-of-the-art performance on a large benchmark.
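
A Genetic Programming search over this representation ultimately has to evaluate candidate expressions on the training data. The following is a minimal NumPy sketch of such an evaluation, purely illustrative and not the author's implementation; the tuple encoding of an IT expression, the function names, and the $1 + q(\mathbf{x})$ denominator are assumptions carried over from the sketch above.

import numpy as np

def eval_it(X, K, w, bias, funcs):
    # Interaction-Transformation expression: an affine combination of
    # univariate functions applied to products of variables raised to
    # integer exponents.
    # X: (n_samples, n_vars), K: (n_terms, n_vars), w: (n_terms,)
    interactions = np.prod(X[:, None, :] ** K[None, :, :], axis=2)
    transformed = np.column_stack([f(interactions[:, i])
                                   for i, f in enumerate(funcs)])
    return bias + transformed @ w

def eval_tir(X, it_p, it_q, inv_g):
    # Transformation-Interaction-Rational: ratio of two IT expressions,
    # mapped back through the inverse of the target transformation
    # (assumed form, following the sketch above).
    p = eval_it(X, *it_p)
    q = eval_it(X, *it_q)
    return inv_g(p / (1.0 + q))

# Toy usage: y = 2*log(x0*x1) / (1 + 0.5*x0^2) with an identity target transform.
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 2.0, size=(5, 2))
it_p = (np.array([[1, 1]]), np.array([2.0]), 0.0, [np.log])       # numerator
it_q = (np.array([[2, 0]]), np.array([0.5]), 0.0, [lambda z: z])  # denominator
print(eval_tir(X, it_p, it_q, inv_g=lambda z: z))

In a GP setting, crossover and mutation would presumably act on the exponent matrices, the chosen transformation functions, and the coefficients of the two IT expressions, with an evaluation of this kind serving as the fitness computation.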
Related papers
- Alleviating Overfitting in Transformation-Interaction-Rational Symbolic Regression with Multi-Objective Optimization [0.0]
The performance of Genetic Programming with the Transformation-Interaction-Rational representation was substantially better than with its predecessor.
We extend Transformation-Interaction-Rational to support multi-objective optimization, specifically the NSGA-II algorithm, and apply it to the same benchmark.
arXiv Detail & Related papers (2025-01-03T17:21:05Z) - EulerFormer: Sequential User Behavior Modeling with Complex Vector Attention [88.45459681677369]
We propose a novel transformer variant with complex vector attention, named EulerFormer.
It provides a unified theoretical framework to formulate both semantic difference and positional difference.
It is more robust to semantic variations and possesses superior theoretical properties in principle.
arXiv Detail & Related papers (2024-03-26T14:18:43Z) - Data-driven path collective variables [0.0]
We propose a new method for the generation, optimization, and comparison of collective variables.
The resulting collective variable is one-dimensional, interpretable, and differentiable.
We demonstrate the validity of the method on two different applications.
arXiv Detail & Related papers (2023-12-21T14:07:47Z) - Constrained Optimization of Rank-One Functions with Indicator Variables [0.0]
Optimization problems involving a rank-one convex function over constraints modeling restrictions on the support of the decision variables emerge in various machine learning applications.
We propose a constructive approach that exploits a hidden conic structure induced by perspective functions.
This enables us to systematically give perspective formulations for the convex hull descriptions of sets with nonlinear separable or non-separable objective functions, sign constraints on continuous variables, and constraints on indicator variables.
arXiv Detail & Related papers (2023-03-31T15:51:56Z) - Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined canonicalization functions.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
arXiv Detail & Related papers (2022-11-11T21:58:15Z) - End-to-end symbolic regression with transformers [20.172752966322214]
Symbolic regression is a difficult task which usually involves a two-step procedure: predicting the skeleton of the expression and then fitting its numerical constants.
We show that a Transformer can predict the full expression end to end, using the predicted constants as an informed initialization.
arXiv Detail & Related papers (2022-04-22T06:55:43Z) - X-volution: On the unification of convolution and self-attention [52.80459687846842]
We propose a multi-branch elementary module composed of both convolution and self-attention operation.
The proposed X-volution achieves highly competitive visual understanding improvements.
arXiv Detail & Related papers (2021-06-04T04:32:02Z) - Better Regularization for Sequential Decision Spaces: Fast Convergence
Rates for Nash, Correlated, and Team Equilibria [121.36609493711292]
We study the application of iterative first-order methods to the problem of computing equilibria of large-scale two-player extensive-form games.
By instantiating first-order methods with our regularizers, we develop the first accelerated first-order methods for computing correlated equilibria and ex-ante coordinated team equilibria.
arXiv Detail & Related papers (2021-05-27T06:10:24Z) - Zoetrope Genetic Programming for Regression [2.642406403099596]
The Zoetrope Genetic Programming (ZGP) algorithm is based on an original representation for mathematical expressions.
ZGP is validated using a large number of public domain regression datasets.
arXiv Detail & Related papers (2021-02-26T10:47:10Z) - Learning Symbolic Expressions: Mixed-Integer Formulations, Cuts, and
Heuristics [1.1602089225841632]
We consider the problem of learning a regression function without assuming its functional form.
We propose a heuristic that builds an expression tree by solving a restricted mixed-integer formulation.
arXiv Detail & Related papers (2021-02-16T18:39:14Z) - UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
We provide a proof that a class of learnable functions can universally approximate any valid intensity function.
We implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis functions upon each event.
arXiv Detail & Related papers (2020-07-28T09:31:56Z) - Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers group actions is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)