Evolutionary Approach to S-box Generation: Optimizing Nonlinear Substitutions in Symmetric Ciphers
- URL: http://arxiv.org/abs/2407.03510v1
- Date: Wed, 3 Jul 2024 21:15:24 GMT
- Title: Evolutionary Approach to S-box Generation: Optimizing Nonlinear Substitutions in Symmetric Ciphers
- Authors: Oleksandr Kuznetsov, Nikolay Poluyanenko, Emanuele Frontoni, Marco Arnesano, Oleksii Smirnov
- Abstract summary: This study explores the application of genetic algorithms in generating highly nonlinear substitution boxes (S-boxes) for symmetric key cryptography.
We present a novel implementation that combines a genetic algorithm with the Walsh-Hadamard Spectrum (WHS) cost function to produce 8x8 S-boxes with a nonlinearity of 104.
Our approach achieves performance parity with the best-known methods, requiring an average of 49,399 iterations with a 100% success rate.
- Score: 19.65010496906351
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This study explores the application of genetic algorithms in generating highly nonlinear substitution boxes (S-boxes) for symmetric key cryptography. We present a novel implementation that combines a genetic algorithm with the Walsh-Hadamard Spectrum (WHS) cost function to produce 8x8 S-boxes with a nonlinearity of 104. Our approach achieves performance parity with the best-known methods, requiring an average of 49,399 iterations with a 100% success rate. The study demonstrates significant improvements over earlier genetic algorithm implementations in this field, reducing iteration counts by orders of magnitude. By achieving equivalent performance through a different algorithmic approach, our work expands the toolkit available to cryptographers and highlights the potential of genetic methods in cryptographic primitive generation. The adaptability and parallelization potential of genetic algorithms suggest promising avenues for future research in S-box generation, potentially leading to more robust, efficient, and innovative cryptographic systems. Our findings contribute to the ongoing evolution of symmetric key cryptography, offering new perspectives on optimizing critical components of secure communication systems.
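The paper's exact WHS cost function and GA operators are not reproduced in this listing, but the quantity being optimized, nonlinearity, is standard: for an n-bit S-box, NL(S) = 2^(n-1) - (1/2) * max over all a and all nonzero b of |W_S(a, b)|, where W_S(a, b) = sum over x of (-1)^(b·S(x) XOR a·x) is the Walsh-Hadamard spectrum. The sketch below is a minimal, hypothetical illustration of how such a fitness score can be computed for an 8x8 S-box; it is not the authors' implementation.

```python
# Minimal sketch (not the paper's code): score an 8x8 S-box by its
# nonlinearity, computed from its Walsh-Hadamard spectrum.

def fwht(signs):
    """Fast Walsh-Hadamard transform of a +/-1 sequence (length a power of two)."""
    w = list(signs)
    h = 1
    while h < len(w):
        for i in range(0, len(w), 2 * h):
            for j in range(i, i + h):
                w[j], w[j + h] = w[j] + w[j + h], w[j] - w[j + h]
        h *= 2
    return w

def nonlinearity(sbox):
    """NL(S) = 128 - max_{a, b != 0} |W_S(a, b)| / 2 for an 8-bit S-box."""
    worst = 0
    for b in range(1, 256):  # every nonzero linear combination of output bits
        # Component Boolean function x -> b . S(x), encoded as +/-1 values.
        signs = [1 - 2 * (bin(b & sbox[x]).count("1") & 1) for x in range(256)]
        worst = max(worst, max(abs(c) for c in fwht(signs)))
    return 128 - worst // 2

# Sanity check: the identity permutation is fully linear, so NL = 0.
assert nonlinearity(list(range(256))) == 0
```

A genetic algorithm in this setting would evolve a population of 8-bit permutations, using a WHS-derived cost in place of (or alongside) this raw nonlinearity score; reaching NL = 104 corresponds to driving the maximum absolute Walsh coefficient down to 48.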
Related papers
- MeGA: Merging Multiple Independently Trained Neural Networks Based on Genetic Algorithm [0.0]
We introduce MeGA, a novel method for merging the weights of multiple pre-trained neural networks using a genetic algorithm.
Our approach leverages a genetic algorithm with tournament selection, crossover, and mutation to optimize weight combinations, creating a more effective fusion.
arXiv Detail & Related papers (2024-06-07T03:31:58Z)
- Genetic Engineering Algorithm (GEA): An Efficient Metaheuristic Algorithm for Solving Combinatorial Optimization Problems [1.8434042562191815]
Genetic Algorithms (GAs) are known for their efficiency in solving optimization problems.
This paper proposes a new metaheuristic algorithm called the Genetic Engineering Algorithm (GEA) that draws inspiration from genetic engineering concepts.
arXiv Detail & Related papers (2023-09-28T13:05:30Z)
- GloptiNets: Scalable Non-Convex Optimization with Certificates [61.50835040805378]
We present a novel approach to non-convex optimization with certificates, which handles smooth functions on the hypercube or on the torus.
By exploiting the regularity of the target function intrinsic in the decay of its spectrum, we are able to obtain precise certificates and leverage the advanced and powerful computational techniques developed to optimize neural networks.
arXiv Detail & Related papers (2023-06-26T09:42:59Z)
- Stochastic Ratios Tracking Algorithm for Large Scale Machine Learning Problems [0.7614628596146599]
We propose a novel algorithm for adaptive step length selection in the classical SGD framework.
Under reasonable conditions, the algorithm produces step lengths in line with well-established theoretical requirements.
We show that the algorithm can generate step lengths comparable to the best step length obtained from manual tuning.
arXiv Detail & Related papers (2023-05-17T06:22:11Z)
- Genetically Modified Wolf Optimization with Stochastic Gradient Descent for Optimising Deep Neural Networks [0.0]
This research aims to analyze an alternative approach to optimizing neural network (NN) weights, with the use of population-based metaheuristic algorithms.
A hybrid between Grey Wolf Optimization (GWO) and Genetic Algorithms (GA) is explored, in conjunction with Stochastic Gradient Descent (SGD).
This algorithm allows for a combination between exploitation and exploration, whilst also tackling the issue of high-dimensionality.
arXiv Detail & Related papers (2023-01-21T13:22:09Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Distributed Evolution Strategies for Black-box Stochastic Optimization [42.90600124972943]
This work concerns the evolutionary approaches to distributed black-box optimization.
Each worker can individually solve an approximation of the problem.
We propose two alternative simulation schemes that significantly improve the robustness of the approach.
arXiv Detail & Related papers (2022-04-09T11:18:41Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded based on the 'complexity' of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layered neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- AdaLead: A simple and robust adaptive greedy search algorithm for sequence design [55.41644538483948]
We develop an easy-to-implement, scalable, and robust evolutionary greedy algorithm (AdaLead).
AdaLead is a remarkably strong benchmark that out-competes more complex state-of-the-art approaches in a variety of biologically motivated sequence design challenges.
arXiv Detail & Related papers (2020-10-05T16:40:38Z)
- A Dynamical Systems Approach for Convergence of the Bayesian EM Algorithm [59.99439951055238]
We show how (discrete-time) Lyapunov stability theory can serve as a powerful tool to aid, or even lead, in the analysis (and potential design) of optimization algorithms that are not necessarily gradient-based.
The particular ML problem that this paper focuses on is that of parameter estimation in an incomplete-data Bayesian framework via the popular optimization algorithm known as maximum a posteriori expectation-maximization (MAP-EM).
We show that fast convergence (linear or quadratic) is achieved, which could have been difficult to unveil without our adopted S&C approach.
arXiv Detail & Related papers (2020-06-23T01:34:18Z)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm is proved to achieve the best-available convergence rate for non-PL objectives while simultaneously outperforming existing algorithms for PL objectives.
arXiv Detail & Related papers (2020-02-13T05:42:27Z)