Digging Deeper: Operator Analysis for Optimizing Nonlinearity of Boolean
Functions
- URL: http://arxiv.org/abs/2302.05890v1
- Date: Sun, 12 Feb 2023 10:34:01 GMT
- Title: Digging Deeper: Operator Analysis for Optimizing Nonlinearity of Boolean
Functions
- Authors: Marko Djurasevic, Domagoj Jakobovic, Luca Mariot, Stjepan Picek
- Abstract summary: We investigate the effects of genetic operators for bit-string encoding in optimizing nonlinearity.
By observing the range of possible changes an operator can provide, one can use this information to design a more effective combination of genetic operators.
- Score: 8.382710169577447
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Boolean functions are mathematical objects with numerous applications in
domains like coding theory, cryptography, and telecommunications. Finding
Boolean functions with specific properties is a complex combinatorial
optimization problem where the search space grows super-exponentially with the
number of input variables. One common property of interest is the nonlinearity
of Boolean functions. Constructing highly nonlinear Boolean functions is
difficult as it is not always known what nonlinearity values can be reached in
practice. In this paper, we investigate the effects of the genetic operators
for bit-string encoding in optimizing nonlinearity. While several mutation and
crossover operators have commonly been used, the link between the genotype they
operate on and the resulting phenotype changes is mostly obscure. By observing
the range of possible changes an operator can provide, as well as relative
probabilities of specific transitions in the objective space, one can use this
information to design a more effective combination of genetic operators. The
analysis reveals interesting insights into operator effectiveness and indicates
how algorithm design may improve convergence compared to an operator-agnostic
genetic algorithm.
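To make the optimization target concrete, the following sketch (not the paper's implementation; function names are our own) shows how the nonlinearity of an n-variable Boolean function is computed from its truth-table bit string via the fast Walsh-Hadamard transform, together with a simple bit-flip mutation operator of the kind whose effects the paper analyzes:

```python
import random

def walsh_hadamard(truth_table):
    """Fast Walsh-Hadamard transform of a truth table (0/1 entries).

    Entries are first mapped to +1/-1; returns the Walsh spectrum.
    """
    f = [1 - 2 * b for b in truth_table]
    n = len(f)
    h = 1
    while h < n:
        # Standard in-place butterfly over blocks of size 2h.
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = f[j], f[j + h]
                f[j], f[j + h] = x + y, x - y
        h *= 2
    return f

def nonlinearity(truth_table):
    """Nonlinearity = 2^(n-1) - max|W_f| / 2 for an n-variable function."""
    spectrum = walsh_hadamard(truth_table)
    return len(truth_table) // 2 - max(abs(w) for w in spectrum) // 2

def bit_flip_mutation(truth_table, p):
    """Flip each truth-table bit independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in truth_table]
```

For example, the bent function f(x1,x2,x3,x4) = x1·x2 XOR x3·x4 attains the maximum nonlinearity 6 for n = 4, while any affine function has nonlinearity 0; a single application of `bit_flip_mutation` to a truth table corresponds to one genotype-level move whose phenotype (nonlinearity) change the paper studies.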
Related papers
- Operator Learning Using Random Features: A Tool for Scientific Computing [3.745868534225104]
Supervised operator learning centers on the use of training data to estimate maps between infinite-dimensional spaces.
This paper introduces the function-valued random features method.
It leads to a supervised operator learning architecture that is practical for nonlinear problems.
arXiv Detail & Related papers (2024-08-12T23:10:39Z)
- A New Angle: On Evolving Rotation Symmetric Boolean Functions [32.90791284928444]
This paper uses several evolutionary algorithms to evolve rotation symmetric Boolean functions with different properties.
Surprisingly, we find bitstring and floating point encodings work better than the tree encoding.
arXiv Detail & Related papers (2023-11-20T16:16:45Z)
- A Search for Nonlinear Balanced Boolean Functions by Leveraging Phenotypic Properties [3.265773263570237]
We consider the problem of finding perfectly balanced Boolean functions with high nonlinearity values.
Such functions have extensive applications in domains such as cryptography and error-correcting coding theory.
We provide an approach for finding such functions by a local search method that exploits the structure of the underlying problem.
arXiv Detail & Related papers (2023-06-15T15:16:19Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- Generative Adversarial Neural Operators [59.21759531471597]
We propose the generative adversarial neural operator (GANO), a generative model paradigm for learning probabilities on infinite-dimensional function spaces.
GANO consists of two main components, a generator neural operator and a discriminator neural functional.
We empirically study GANOs in controlled cases where both input and output functions are samples from GRFs and compare its performance to the finite-dimensional counterpart GAN.
arXiv Detail & Related papers (2022-05-06T05:12:22Z)
- Evolving Constructions for Balanced, Highly Nonlinear Boolean Functions [37.84234862910533]
We show that genetic programming can evolve constructions resulting in balanced Boolean functions with high nonlinearity.
Our results show that GP can find constructions that generalize well, i.e., result in the required functions for multiple tested sizes.
Interestingly, the simplest solution found by GP is a particular case of the well-known indirect sum construction.
arXiv Detail & Related papers (2022-02-17T16:33:24Z)
- Evolutionary Construction of Perfectly Balanced Boolean Functions [7.673465837624365]
We investigate the use of Genetic Programming (GP) and Genetic Algorithms (GA) to construct Boolean functions that satisfy a property, perfect balancedness, along with a good nonlinearity profile.
Surprisingly, the results show that GA with the weightwise balanced representation outperforms GP with the classical truth table phenotype in finding highly nonlinear weightwise perfectly balanced (WPB) functions.
arXiv Detail & Related papers (2022-02-16T18:03:04Z)
- Graph-adaptive Rectified Linear Unit for Graph Neural Networks [64.92221119723048]
Graph Neural Networks (GNNs) have achieved remarkable success by extending traditional convolution to learning on non-Euclidean data.
We propose Graph-adaptive Rectified Linear Unit (GReLU) which is a new parametric activation function incorporating the neighborhood information in a novel and efficient way.
We conduct comprehensive experiments to show that our plug-and-play GReLU method is efficient and effective given different GNN backbones and various downstream tasks.
arXiv Detail & Related papers (2022-02-13T10:54:59Z)
- Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z)
- Category-Learning with Context-Augmented Autoencoder [63.05016513788047]
Finding an interpretable non-redundant representation of real-world data is one of the key problems in Machine Learning.
We propose a novel method of using data augmentations when training autoencoders.
We train a Variational Autoencoder in such a way, that it makes transformation outcome predictable by auxiliary network.
arXiv Detail & Related papers (2020-10-10T14:04:44Z)
- Hardness of Random Optimization Problems for Boolean Circuits, Low-Degree Polynomials, and Langevin Dynamics [78.46689176407936]
We show that families of algorithms fail to produce nearly optimal solutions with high probability.
For the case of Boolean circuits, our results improve the state-of-the-art bounds known in circuit complexity theory.
arXiv Detail & Related papers (2020-04-25T05:45:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.