Shift-invariant functions and almost liftings
- URL: http://arxiv.org/abs/2407.11931v1
- Date: Tue, 16 Jul 2024 17:23:27 GMT
- Title: Shift-invariant functions and almost liftings
- Authors: Jan Kristian Haugland, Tron Omland
- Abstract summary: We investigate shift-invariant vectorial Boolean functions on $n$ bits that are lifted from Boolean functions on $k$ bits, for $k\leq n$.
We show that if a Boolean function with diameter $k$ is an almost lifting, then the maximum number of collisions of its lifted functions is $2^{k-1}$ for any $n$.
We search for functions in the class of almost liftings that have good cryptographic properties and for which the non-bijectivity does not cause major security weaknesses.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate shift-invariant vectorial Boolean functions on $n$~bits that are lifted from Boolean functions on $k$~bits, for $k\leq n$. We consider vectorial functions that are not necessarily permutations, but are, in some sense, almost bijective. In this context, we define an almost lifting as a Boolean function for which there is an upper bound on the number of collisions of its lifted functions that does not depend on $n$. We show that if a Boolean function with diameter $k$ is an almost lifting, then the maximum number of collisions of its lifted functions is $2^{k-1}$ for any $n$. Moreover, we search for functions in the class of almost liftings that have good cryptographic properties and for which the non-bijectivity does not cause major security weaknesses. These functions generalize the well-known map $\chi$ used in the Keccak hash function.
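As a concrete illustration of the lifting construction (a minimal sketch, not code from the paper), the Python snippet below lifts a Boolean function $f$ on $k$ bits to a shift-invariant vectorial function on $n$ bits by applying $f$ to each window of $k$ cyclically consecutive input bits; with $k=3$ and the local rule $f(x_0,x_1,x_2)=x_0\oplus((x_1\oplus 1)\wedge x_2)$ this recovers the Keccak map $\chi$. It also counts collisions of the lifted map by brute force, assuming "number of collisions" means $2^n$ minus the size of the image; that reading, and the helper names `lift`, `chi_local`, and `collision_count`, are illustrative assumptions rather than the authors' definitions.

```python
from itertools import product
from collections import Counter

def lift(f, k, n):
    """Lift a k-bit Boolean function f to a shift-invariant vectorial
    function F on n bits: output bit i is f applied to the window
    (x_i, x_{i+1}, ..., x_{i+k-1}), with indices taken modulo n."""
    def F(x):  # x is a tuple of n bits
        return tuple(f(*(x[(i + j) % n] for j in range(k))) for i in range(n))
    return F

def chi_local(x0, x1, x2):
    # Local rule of the Keccak map chi: x0 XOR ((NOT x1) AND x2).
    return x0 ^ ((x1 ^ 1) & x2)

def collision_count(F, n):
    """Collisions of F on {0,1}^n, counted here as 2^n minus the image
    size (an assumed reading of the paper's definition)."""
    image = Counter(F(x) for x in product((0, 1), repeat=n))
    return sum(multiplicity - 1 for multiplicity in image.values())

if __name__ == "__main__":
    for n in range(3, 9):
        F = lift(chi_local, 3, n)
        print(f"n = {n}: collisions = {collision_count(F, n)}")
```

Since $\chi$ is known to be bijective exactly when $n$ is odd, the printed count should be $0$ for odd $n$; if $\chi$ is itself an almost lifting, as the abstract's closing remark suggests, the $2^{k-1}=4$ bound would cap the counts for even $n$.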
Related papers
- New classes of reversible cellular automata [0.0]
A shift-invariant vectorial Boolean function $F$ induces a proper lifting for every $n\geq k$.
We construct new families of such liftings for arbitrarily large $k$ and discuss whether all have been identified for $k\leq 6$.
arXiv Detail & Related papers (2024-11-01T16:33:53Z)
- Embeddings between Barron spaces with higher order activation functions [1.0999592665107414]
We study embeddings between Barron spaces with different activation functions.
An activation function of particular interest is the rectified power unit ($\operatorname{RePU}$) given by $\operatorname{RePU}_s(x)=\max(0,x)^s$.
arXiv Detail & Related papers (2023-05-25T08:31:59Z)
- On Symmetric Pseudo-Boolean Functions: Factorization, Kernels and Applications [0.0]
We prove that any symmetric pseudo-Boolean function can be equivalently expressed as a power series or factorized.
We use these results to analyze symmetric pseudo-Boolean functions appearing in the literature of spin glass energy functions, quantum information and tensor networks.
arXiv Detail & Related papers (2022-09-29T18:00:07Z)
- Dueling Convex Optimization with General Preferences [85.14061196945599]
We address the problem of convex optimization with dueling feedback, where the goal is to minimize a convex function given a weaker form of dueling feedback.
Our main contribution is an efficient algorithm with convergence rate $\smash{\widetilde O}(\epsilon^{-4p})$ for a smooth convex objective function, and an efficient $\smash{\widetilde O}(\epsilon^{-2p})$ rate when the objective is smooth and strongly convex.
arXiv Detail & Related papers (2022-09-27T11:10:41Z)
- Exponential Separation between Quantum and Classical Ordered Binary Decision Diagrams, Reordering Method and Hierarchies [68.93512627479197]
We study the quantum Ordered Binary Decision Diagram ($OBDD$) model.
We prove lower bounds and upper bounds for OBDD with arbitrary order of input variables.
We extend the hierarchy for read-$k$-times Ordered Binary Decision Diagrams ($k$-OBDD) of polynomial width.
arXiv Detail & Related papers (2022-04-22T12:37:56Z)
- Convexity of a certain operator trace functional [1.1470070927586014]
In this article the operator trace function $\Lambda_{r,s}(A)[K, M] := \operatorname{tr}(K^* A^r M A^r K)^s$ is introduced and its convexity and concavity properties are investigated.
This function has a direct connection to several well-studied operator trace functions that appear in quantum information theory.
arXiv Detail & Related papers (2021-09-23T17:51:46Z)
- Submodular + Concave [53.208470310734825]
It has been well established that first order optimization methods can converge to the maximal objective value of concave functions.
In this work, we initiate the study of maximizing functions of the form $F(x) = G(x) + C(x)$ over a convex body, where $G$ is a smooth DR-submodular function and $C$ is a smooth concave function.
This class of functions is an extension of both concave and continuous DR-submodular functions, for which no guarantee is known.
arXiv Detail & Related papers (2021-06-09T01:59:55Z)
- Neural networks with superexpressive activations and integer weights [91.3755431537592]
An example of an activation function $\sigma$ is given such that networks with activations $\{\sigma, \lfloor\cdot\rfloor\}$, integer weights, and a fixed architecture can approximate continuous functions.
The range of integer weights required for $\varepsilon$-approximation of Hölder continuous functions is derived.
arXiv Detail & Related papers (2021-05-20T17:29:08Z)
- Size and Depth Separation in Approximating Natural Functions with Neural Networks [52.73592689730044]
We show the benefits of size and depth for approximation of natural functions with ReLU networks.
We show a complexity-theoretic barrier to proving such results beyond size $O(d)$.
We also show an explicit natural function that can be approximated with networks of size $O(d)$.
arXiv Detail & Related papers (2021-01-30T21:30:11Z)
- On the Modularity of Hypernetworks [103.1147622394852]
We show that for a structured target function, the overall number of trainable parameters in a hypernetwork is smaller by orders of magnitude than the number of trainable parameters of a standard neural network and an embedding method.
arXiv Detail & Related papers (2020-02-23T22:51:52Z)