SVTest: general purpose software for testing weakly random sources with exemplary application to seismic data analysis enabling quantum amplification
- URL: http://arxiv.org/abs/2504.06899v1
- Date: Wed, 09 Apr 2025 13:58:34 GMT
- Title: SVTest: general purpose software for testing weakly random sources with exemplary application to seismic data analysis enabling quantum amplification
- Authors: Maciej Stankiewicz, Roberto Salazar, Mikołaj Czechlewski, Alejandra Muñoz Jensen, Catalina Morales-Yáñez, Omer Sakarya, Julio Viveros Carrasco, Karol Horodecki
- Abstract summary: Quantum devices can amplify the privacy of a weak source of randomness. One of the theoretical models of such weak sources is the so-called Santha-Vazirani (SV) source. We develop software to estimate the parameter characterizing the source's randomness.
- Score: 34.82692226532414
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generating private randomness is essential for numerous applications ranging from security proofs to online banking. Consequently, the capacity of quantum devices to amplify the privacy of a weak source of randomness, in cases unattainable by classical methods, constitutes a practical advantage. One of the theoretical models of such weak sources are the so-called Santha-Vazirani (SV) sources; however, finding natural sources satisfying the SV model is a paramount challenge. In this article, we take three significant steps on the way to simplify this hard task. We begin with an in-depth analysis of the mathematical background for estimating the quality of a weak randomness source by providing a set of axioms that systematize the possible approaches to such estimation. We then develop software (SVTest) to estimate the parameter characterizing the source's randomness. The software is general-purpose, i.e., it can test the randomness of any target sequence of bits. Later, we apply the software to test seismic events (earthquakes and local noise) as potential sources of randomness. Our results demonstrate that seismic phenomena are possible sources of randomness, depending on the choice of discretization. Therefore, our work provides strong evidence of the potential of geophysical phenomena as a source of cryptographic resources, building an unprecedented bridge between both fields.
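The parameter estimation described in the abstract can be illustrated with a simple plug-in estimator: an SV source with bias epsilon keeps every conditional probability P(b_i = 1 | previous bits) within [1/2 - epsilon, 1/2 + epsilon], so one can estimate epsilon by measuring the largest empirical deviation from 1/2 across contexts. A minimal Python sketch, assuming a fixed context length (this is an illustrative estimator, not the actual SVTest implementation):

```python
import random
from collections import defaultdict

def estimate_sv_epsilon(bits, context_len=3):
    """Plug-in estimate of the Santha-Vazirani bias parameter: the
    largest deviation of the empirical conditional probability
    P(next bit = 1 | preceding context) from 1/2.

    Illustrative sketch only; the real SVTest software uses a more
    careful estimation procedure.
    """
    counts = defaultdict(lambda: [0, 0])  # context -> [zeros, ones]
    for i in range(context_len, len(bits)):
        ctx = tuple(bits[i - context_len:i])
        counts[ctx][bits[i]] += 1
    eps = 0.0
    for zeros, ones in counts.values():
        total = zeros + ones
        if total > 0:
            eps = max(eps, abs(ones / total - 0.5))
    return eps

# A long fair-coin sequence should give an estimate close to 0
# (up to sampling noise); a fully predictable one gives 0.5.
random.seed(0)
fair = [random.randint(0, 1) for _ in range(100_000)]
print(estimate_sv_epsilon(fair))
```

Note the trade-off in `context_len`: longer contexts capture more structure but each context is seen fewer times, inflating the deviation purely through sampling noise.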
Related papers
- Delegated verification of quantum randomness with linear optics [0.0]
Quantum mechanics offers the best properties of an entropy source in terms of unpredictability. We present a method that allows a third party to publicly perform statistical testing without compromising the confidentiality of the random bits.
arXiv Detail & Related papers (2024-11-29T13:11:01Z)
- Efficient Quality Estimation of True Random Bit-streams [5.441027708840589]
This paper reports the implementation and characterization of an on-line procedure for the detection of anomalies in a true random bit stream.
The experimental validation of the approach is performed upon the bit streams generated by a quantum, silicon-based entropy source.
arXiv Detail & Related papers (2024-09-09T12:09:17Z)
- To what extent are multiple pendulum systems viable in pseudo-random number generation? [0.0]
This paper explores the development and viability of an alternative pseudorandom number generator (PRNG). Traditional PRNGs, notably the one implemented in the Java.Random class, suffer from predictability, which gives rise to exploitability.
This study proposes a novel PRNG designed using ordinary differential equations, physics modeling, and chaos theory.
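The general idea of chaos-based pseudorandomness can be sketched with a much simpler chaotic system than the pendulum ODEs this study proposes: the logistic map. The sketch below illustrates sensitivity to initial conditions, the property such generators rely on, but it is a toy and is not cryptographically secure:

```python
def logistic_prng_bits(seed: float, n: int, r: float = 3.99):
    """Toy bit generator from the chaotic logistic map x -> r*x*(1-x).

    Illustrative only: this is not the pendulum-based PRNG from the
    paper, and bare chaotic maps are not cryptographically secure.
    """
    x = seed  # seed must lie strictly in (0, 1)
    bits = []
    for _ in range(n):
        x = r * x * (1 - x)          # one chaotic iteration
        bits.append(1 if x > 0.5 else 0)  # threshold to a bit
    return bits

# Tiny seed perturbations diverge exponentially (positive Lyapunov
# exponent), so nearby seeds quickly produce different bit streams.
bits = logistic_prng_bits(0.123456789, 20)
print(bits)
```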
arXiv Detail & Related papers (2024-04-15T00:28:51Z)
- Sourcerer: Sample-based Maximum Entropy Source Distribution Estimation [5.673617376471343]
We propose an approach which targets the maximum entropy distribution, i.e., prioritizes retaining as much uncertainty as possible. Our method is purely sample-based, leveraging the Sliced-Wasserstein distance to measure the discrepancy between the dataset and simulations. To demonstrate the utility of our approach, we infer source distributions for parameters of the Hodgkin-Huxley model from experimental datasets with thousands of single-neuron measurements.
arXiv Detail & Related papers (2024-02-12T17:13:02Z)
- Decomposing Uncertainty for Large Language Models through Input Clarification Ensembling [69.83976050879318]
In large language models (LLMs), identifying sources of uncertainty is an important step toward improving reliability, trustworthiness, and interpretability.
In this paper, we introduce an uncertainty decomposition framework for LLMs, called input clarification ensembling.
Our approach generates a set of clarifications for the input, feeds them into an LLM, and ensembles the corresponding predictions.
arXiv Detail & Related papers (2023-11-15T05:58:35Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
We introduce the qDrift protocol, which builds random product formulas by sampling from the Hamiltonian according to the coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
- Testing randomness of series generated in Bell's experiment [62.997667081978825]
We use a toy fiber optic based setup to generate binary series, and evaluate their level of randomness according to Ville principle.
Series are tested with a battery of standard statistical indicators: Hurst exponent, Kolmogorov complexity, minimum entropy, Takens' embedding dimension, and the Augmented Dickey-Fuller and Kwiatkowski-Phillips-Schmidt-Shin tests to check stationarity.
The level of randomness of series obtained by applying Toeplitz extractor to rejected series is found to be indistinguishable from the level of non-rejected raw ones.
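One of the simpler indicators in such a test battery, the empirical min-entropy, can be sketched as a block-frequency estimate: split the series into fixed-size blocks and take minus the log of the most frequent block's relative frequency. This is an illustrative estimator, not the exact test used in the paper:

```python
import math
import random
from collections import Counter

def min_entropy_per_block(bits, block=8):
    """Empirical min-entropy of a bit series per block of `block` bits:
    -log2 of the relative frequency of the most common block pattern.

    Illustrative sketch; real test batteries use more refined
    estimators (e.g. the NIST SP 800-90B procedures).
    """
    blocks = [tuple(bits[i:i + block])
              for i in range(0, len(bits) - block + 1, block)]
    p_max = max(Counter(blocks).values()) / len(blocks)
    return -math.log2(p_max)

# Near-uniform bits score close to the block size; a constant
# sequence scores 0 (one pattern occurs with frequency 1).
random.seed(0)
sample = [random.randint(0, 1) for _ in range(100_000)]
print(min_entropy_per_block(sample))
```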
arXiv Detail & Related papers (2022-08-31T17:39:29Z)
- Principled Knowledge Extrapolation with GANs [92.62635018136476]
We study counterfactual synthesis from a new perspective of knowledge extrapolation.
We show that an adversarial game with a closed-form discriminator can be used to address the knowledge extrapolation problem.
Our method enjoys both elegant theoretical guarantees and superior performance in many scenarios.
arXiv Detail & Related papers (2022-05-21T08:39:42Z)
- Uncertainty Modeling for Out-of-Distribution Generalization [56.957731893992495]
We argue that the feature statistics can be properly manipulated to improve the generalization ability of deep learning models.
Common methods often consider the feature statistics as deterministic values measured from the learned features.
We improve the network generalization ability by modeling the uncertainty of domain shifts with synthesized feature statistics during training.
arXiv Detail & Related papers (2022-02-08T16:09:12Z)
- Certified Randomness From Steering Using Sequential Measurements [0.0]
A single entangled two-qubit pure state can be used to produce arbitrary amounts of certified randomness.
Motivated by these difficulties in the device-independent setting, we consider the scenario of one-sided device independence.
We show how certain aspects of previous work can be adapted to this scenario and provide theoretical bounds on the amount of randomness which can be certified.
arXiv Detail & Related papers (2020-08-03T08:18:29Z)
- Semi-Device-Independent Random Number Generation with Flexible Assumptions [0.0]
We propose a new framework for semi-device-independent randomness certification using a source of trusted vacuum in the form of a signal shutter.
We experimentally demonstrate our protocol with a photonic setup and generate secure random bits under three different assumptions with varying degrees of security and resulting data rates.
arXiv Detail & Related papers (2020-02-27T18:05:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of this information and is not responsible for any consequences of its use.