A Statistical Side-Channel Risk Model for Timing Variability in Lattice-Based Post-Quantum Cryptography
- URL: http://arxiv.org/abs/2512.22301v1
- Date: Fri, 26 Dec 2025 03:12:33 GMT
- Title: A Statistical Side-Channel Risk Model for Timing Variability in Lattice-Based Post-Quantum Cryptography
- Authors: Aayush Mainali, Sirjan Ghimire
- Abstract summary: Timing side-channels are an important threat to cryptography that still needs to be addressed in implementations. Lattice-based schemes may produce secret-dependent timing variability arising from complex arithmetic and control flow. A scenario-based statistical risk model is proposed that treats timing leakage as a problem of distributional distinguishability under controlled execution conditions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Timing side-channels remain an important threat to cryptographic implementations, and the advent of post-quantum cryptography sharpens this issue because lattice-based schemes may exhibit secret-dependent timing variability arising from complex arithmetic and control flow. Since real timing measurements are also affected by environmental noise (e.g., scheduling effects, contention, heavy-tailed delays), this work proposes a scenario-based statistical risk model that treats timing leakage as a problem of distributional distinguishability under controlled execution conditions. We synthesize traces for two secret classes in idle, jitter, and loaded scenarios and for multiple leakage models, and quantify leakage with Welch's t-test, the Kolmogorov-Smirnov (KS) distance, Cliff's delta, mutual information, and distribution overlap, combining these metrics in a TLRI-like manner to obtain a consistent score for ranking scenarios. Across representative lattice-based KEM families (Kyber, Saber, Frodo), idle conditions generally yield the best distinguishability, while jitter and loaded conditions erode it through increased variance and overlap; cache-index and branch-style leakage tends to give the highest risk signals, and faster schemes can have a higher peak risk under similar leakage assumptions. The model enables reproducible comparisons at an early design stage, prior to platform-specific validation.
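The metric pipeline described in the abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: the synthetic timing distributions, the soft normalization of the t-statistic, and the equal-weight combination into a TLRI-like score are all choices made for this sketch (mutual information is omitted for brevity).

```python
# Sketch: distributional distinguishability between two secret classes.
# Distributions, weights, and the combination rule are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def cliffs_delta(a, b):
    """Cliff's delta: P(a > b) - P(a < b), estimated over all sample pairs."""
    a = np.asarray(a)[:, None]
    b = np.asarray(b)[None, :]
    return (a > b).mean() - (a < b).mean()

def overlap_coefficient(a, b, bins=50):
    """Histogram-based overlap of the two empirical distributions, in [0, 1]."""
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    pa, _ = np.histogram(a, bins=bins, range=(lo, hi))
    pb, _ = np.histogram(b, bins=bins, range=(lo, hi))
    return np.minimum(pa / pa.sum(), pb / pb.sum()).sum()

def leakage_score(a, b):
    """Combine distinguishability metrics into a single TLRI-like score.
    Equal weighting is an assumption, not the paper's calibration."""
    t, _ = stats.ttest_ind(a, b, equal_var=False)   # Welch's t-test
    ks = stats.ks_2samp(a, b).statistic              # KS distance
    d = abs(cliffs_delta(a, b))
    ov = overlap_coefficient(a, b)
    t_norm = abs(t) / (abs(t) + 10.0)  # soft squash of |t| into [0, 1)
    return float(np.mean([t_norm, ks, d, 1.0 - ov]))

# Two secret classes under an "idle" scenario: small class-dependent shift.
idle_a = rng.normal(100.0, 2.0, 2000)
idle_b = rng.normal(101.5, 2.0, 2000)
# "Loaded" scenario: same shift, but variance inflated by system noise.
load_a = rng.normal(100.0, 8.0, 2000)
load_b = rng.normal(101.5, 8.0, 2000)

print(leakage_score(idle_a, idle_b))  # idle: classes are well separated
print(leakage_score(load_a, load_b))  # loaded: noise erodes the score
```

Run on these synthetic traces, the idle scenario scores higher than the loaded one, matching the qualitative trend the abstract reports: added variance increases distribution overlap and suppresses every distinguishability metric at once.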
Related papers
- Contextual and Seasonal LSTMs for Time Series Anomaly Detection [49.50689313712684]
We propose a novel prediction-based framework named Contextual and Seasonal LSTMs (CS-LSTMs). CS-LSTMs are built upon a noise decomposition strategy and jointly leverage contextual dependencies and seasonal patterns. They consistently outperform state-of-the-art methods, highlighting their effectiveness and practical value in robust time series anomaly detection.
arXiv Detail & Related papers (2026-02-10T11:46:15Z) - SMKC: Sketch Based Kernel Correlation Images for Variable Cardinality Time Series Anomaly Detection [0.0]
In operational environments, monitoring systems frequently experience sensor churn. We propose SMKC, a framework that decouples the dynamic input structure from the anomaly detector. We find that a detector using random projections and nearest neighbors on the SMKC representation performs competitively with fully trained baselines.
arXiv Detail & Related papers (2026-01-28T21:15:11Z) - EVEREST: An Evidential, Tail-Aware Transformer for Rare-Event Time-Series Forecasting [4.551615447454767]
EVEREST is a transformer-based architecture for probabilistic rare-event forecasting. It delivers calibrated predictions and tail-aware risk estimation, and is applicable to high-stakes domains such as industrial monitoring, weather, and satellite diagnostics.
arXiv Detail & Related papers (2026-01-26T23:15:20Z) - Stratified Hazard Sampling: Minimal-Variance Event Scheduling for CTMC/DTMC Discrete Diffusion and Flow Models [0.0]
Stratified Hazard Sampling (SHS) models per-token edits as events driven by cumulative hazard (CTMC) or cumulative jump mass (DTMC) and places events by stratifying this cumulative quantity. We also introduce a phase-allocation variant for blacklist-style lexical constraints that prioritizes early edits at high-risk positions to mitigate late-masking artifacts.
arXiv Detail & Related papers (2026-01-06T08:19:02Z) - Controllable Probabilistic Forecasting with Stochastic Decomposition Layers [1.3995263206621]
We introduce Stochastic Decomposition Layers (SDL) for converting deterministic machine learning weather models into ensemble systems. SDL applies learned perturbations at three decoder scales through latent-driven modulation, per-pixel noise, and channel scaling. When applied to WXFormer via transfer learning, SDL requires less than 2% of the computational cost needed to train the baseline model.
arXiv Detail & Related papers (2025-12-21T17:10:00Z) - RI-Loss: A Learnable Residual-Informed Loss for Time Series Forecasting [13.117430904377905]
Time series forecasting relies on predicting future values from historical data. MSE has two fundamental weaknesses: its point-wise error fails to capture temporal relationships, and it does not account for inherent noise in the data. We introduce the Residual-Informed Loss (RI-Loss), a novel objective function based on the Hilbert-Schmidt Independence Criterion (HSIC).
arXiv Detail & Related papers (2025-11-13T09:36:00Z) - Kairos: Towards Adaptive and Generalizable Time Series Foundation Models [27.076542021368056]
Time series foundation models (TSFMs) have emerged as a powerful paradigm for time series analysis. We propose Kairos, a flexible TSFM framework that integrates a dynamic patching tokenizer and an instance-adaptive positional embedding. Kairos achieves superior performance with far fewer parameters on two common zero-shot benchmarks.
arXiv Detail & Related papers (2025-09-30T06:02:26Z) - Impute-MACFM: Imputation based on Mask-Aware Flow Matching [1.9483189922830135]
Impute-MACFM is a conditional flow matching framework for tabular imputation. It addresses the three missingness mechanisms: missing completely at random, missing at random, and missing not at random. It builds trajectories only on missing entries while constraining the predicted velocity to remain near zero on observed entries.
arXiv Detail & Related papers (2025-09-27T05:15:09Z) - COIN: Uncertainty-Guarding Selective Question Answering for Foundation Models with Provable Risk Guarantees [51.5976496056012]
COIN is an uncertainty-guarding selection framework that calibrates statistically valid thresholds to filter a single generated answer per question. COIN estimates the empirical error rate on a calibration set and applies confidence-interval methods to establish a high-probability upper bound on the true error rate. We demonstrate COIN's robustness in risk control, strong test-time power in retaining admissible answers, and predictive efficiency under limited calibration data.
arXiv Detail & Related papers (2025-06-25T07:04:49Z) - Learning from Noisy Labels via Conditional Distributionally Robust Optimization [5.85767711644773]
Crowdsourcing has emerged as a practical solution for labeling large datasets. However, noisy labels from annotators with varying levels of expertise present a significant challenge to learning accurate models.
arXiv Detail & Related papers (2024-11-26T05:03:26Z) - When Does Confidence-Based Cascade Deferral Suffice? [69.28314307469381]
Cascades are a classical strategy to enable inference cost to vary adaptively across samples.
A deferral rule determines whether to invoke the next classifier in the sequence, or to terminate prediction.
Despite being oblivious to the structure of the cascade, confidence-based deferral often works remarkably well in practice.
arXiv Detail & Related papers (2023-07-06T04:13:57Z) - Robust Control for Dynamical Systems With Non-Gaussian Noise via Formal Abstractions [59.605246463200736]
We present a novel controller synthesis method that does not rely on any explicit representation of the noise distributions.
First, we abstract the continuous control system into a finite-state model that captures noise by probabilistic transitions between discrete states.
We use state-of-the-art verification techniques to provide guarantees on the interval Markov decision process and compute a controller for which these guarantees carry over to the original control system.
arXiv Detail & Related papers (2023-01-04T10:40:30Z) - Probabilities Are Not Enough: Formal Controller Synthesis for Stochastic Dynamical Models with Epistemic Uncertainty [68.00748155945047]
Capturing uncertainty in models of complex dynamical systems is crucial to designing safe controllers.
Several approaches use formal abstractions to synthesize policies that satisfy temporal specifications related to safety and reachability.
Our contribution is a novel abstraction-based controller method for continuous-state models with noise, uncertain parameters, and external disturbances.
arXiv Detail & Related papers (2022-10-12T07:57:03Z) - Sampling-Based Robust Control of Autonomous Systems with Non-Gaussian Noise [59.47042225257565]
We present a novel planning method that does not rely on any explicit representation of the noise distributions.
First, we abstract the continuous system into a discrete-state model that captures noise by probabilistic transitions between states.
We capture these bounds in the transition probability intervals of a so-called interval Markov decision process (iMDP).
arXiv Detail & Related papers (2021-10-25T06:18:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.