Feature-aware Modulation for Learning from Temporal Tabular Data
- URL: http://arxiv.org/abs/2512.03678v1
- Date: Wed, 03 Dec 2025 11:13:12 GMT
- Title: Feature-aware Modulation for Learning from Temporal Tabular Data
- Authors: Hao-Run Cai, Han-Jia Ye
- Abstract summary: Temporal distribution shifts pose significant challenges in real-world deployment. Static models assume fixed mappings to ensure generalization, whereas adaptive models may overfit to transient patterns. We propose a feature-aware temporal modulation mechanism that conditions feature representations on temporal context.
- Score: 47.36022303401642
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: While tabular machine learning has achieved remarkable success, temporal distribution shifts pose significant challenges in real-world deployment, as the relationships between features and labels continuously evolve. Static models assume fixed mappings to ensure generalization, whereas adaptive models may overfit to transient patterns, creating a dilemma between robustness and adaptability. In this paper, we analyze key factors essential for constructing an effective dynamic mapping for temporal tabular data. We discover that evolving feature semantics, particularly objective and subjective meanings, introduce concept drift over time. Crucially, we identify that feature transformation strategies are able to mitigate discrepancies in feature representations across temporal stages. Motivated by these insights, we propose a feature-aware temporal modulation mechanism that conditions feature representations on temporal context, modulating statistical properties such as scale and skewness. By aligning feature semantics across time, our approach achieves lightweight yet powerful adaptation, effectively balancing generalizability and adaptability. Benchmark evaluations validate the effectiveness of our method in handling temporal shifts in tabular data.
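The modulation idea in the abstract can be sketched as a FiLM-style transform: a temporal-context embedding predicts per-feature scale and shift parameters that are applied to the tabular feature representations. The snippet below is a minimal numpy illustration under that assumption; all names (`temporal_modulation`, `W_scale`, `W_shift`) are hypothetical and not taken from the paper.

```python
import numpy as np

def temporal_modulation(features, time_embedding, W_scale, W_shift):
    """FiLM-style conditioning: scale and shift each feature column
    based on a temporal-context embedding (illustrative sketch only)."""
    # Per-feature scale and shift predicted from the temporal context.
    gamma = time_embedding @ W_scale   # shape: (n_features,)
    beta = time_embedding @ W_shift    # shape: (n_features,)
    # Modulate statistical properties of the features (here: scale, location).
    return features * gamma + beta

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))          # a batch of tabular rows, 4 features
t = np.array([1.0, 0.5])             # toy temporal-context embedding
W_scale = rng.normal(size=(2, 4))
W_shift = rng.normal(size=(2, 4))
X_mod = temporal_modulation(X, t, W_scale, W_shift)
print(X_mod.shape)  # (8, 4)
```

In a trained model, `W_scale` and `W_shift` would be learned so that feature semantics align across temporal stages.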
Related papers
- MEMTS: Internalizing Domain Knowledge via Parameterized Memory for Retrieval-Free Domain Adaptation of Time Series Foundation Models [51.506429027626005]
Memory for Time Series (MEMTS) is a lightweight and plug-and-play method for retrieval-free domain adaptation in time series forecasting. A key component of MEMTS is a Knowledge Persistence Module (KPM), which internalizes domain-specific temporal dynamics. This paradigm shift enables MEMTS to achieve accurate domain adaptation with constant-time inference and near-zero latency.
arXiv Detail & Related papers (2026-02-14T14:00:06Z)
- Patch-Level Tokenization with CNN Encoders and Attention for Improved Transformer Time-Series Forecasting [0.0]
This paper proposes a two-stage forecasting framework that separates local temporal representation learning from global dependency modelling. A convolutional neural network operates on fixed-length temporal patches to extract short-range temporal dynamics and non-linear feature interactions. Token-level self-attention is applied during representation learning to refine these embeddings, after which a Transformer encoder models inter-patch temporal dependencies to generate forecasts.
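The tokenization stage of such a patch-then-attend pipeline can be sketched simply: the series is cut into fixed-length patches, and a small convolution inside each patch produces one embedding (token) per patch, which a Transformer would then attend over. A toy numpy version follows; the function name, patch length, and kernel are all hypothetical choices for illustration.

```python
import numpy as np

def patch_tokens(series, patch_len, kernel):
    """Split a 1-D series into fixed-length patches and apply a small
    convolution within each patch, pooling to one token per patch
    (illustrative sketch of patch-level tokenization)."""
    n_patches = len(series) // patch_len
    patches = series[:n_patches * patch_len].reshape(n_patches, patch_len)
    # Valid-mode convolution inside each patch, then mean-pool to a token.
    tokens = np.array([
        np.convolve(p, kernel, mode="valid").mean() for p in patches
    ])
    return tokens

x = np.sin(np.linspace(0, 2 * np.pi, 32))   # toy series of length 32
tok = patch_tokens(x, patch_len=8, kernel=np.array([0.25, 0.5, 0.25]))
print(tok.shape)  # (4,) — one token per patch
```

In the full framework the per-patch encoder is a CNN producing vector-valued tokens rather than this scalar pooling, but the patch-to-token structure is the same.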
arXiv Detail & Related papers (2026-01-18T16:16:01Z)
- Learning Time in Static Classifiers [44.358377952850994]
We propose a simple yet effective framework that equips standard feedforward classifiers with temporal reasoning. We use a novel Support-Exemplar-Query (SEQ) learning paradigm, which structures training data into temporally coherent trajectories. Our approach bridges static and temporal learning in a modular and data-efficient manner, requiring only a simple module on top of pre-extracted features.
arXiv Detail & Related papers (2025-11-15T18:42:51Z)
- Estimating Time Series Foundation Model Transferability via In-Context Learning [74.65355820906355]
Time series foundation models (TSFMs) offer strong zero-shot forecasting via large-scale pre-training. Fine-tuning remains critical for boosting performance in domains with limited public data. We introduce TimeTic, a transferability estimation framework that recasts model selection as an in-context-learning problem.
arXiv Detail & Related papers (2025-09-28T07:07:13Z)
- Learning Temporal Saliency for Time Series Forecasting with Cross-Scale Attention [5.992220383989106]
We present CrossScaleNet, an innovative architecture that combines a patch-based cross-attention mechanism with multi-scale processing. Our evaluations demonstrate superior performance in both temporal saliency detection and forecasting accuracy.
arXiv Detail & Related papers (2025-09-26T18:43:51Z)
- Multiresolution Analysis and Statistical Thresholding on Dynamic Networks [49.09073800467438]
ANIE (Adaptive Network Intensity Estimation) is a multi-resolution framework designed to automatically identify the time scales at which network structure evolves. We show that ANIE adapts to the appropriate time resolution and is able to capture sharp structural changes while remaining robust to noise.
arXiv Detail & Related papers (2025-06-01T22:55:55Z)
- Robust Multi-Modal Forecasting: Integrating Static and Dynamic Features [0.0]
Time series forecasting plays a crucial role in various applications, particularly in healthcare. Ensuring transparency and explainability of the models responsible for these tasks is essential for their adoption in critical settings. Recent work has explored a top-down approach to bi-level transparency, focusing on understanding trends and properties of predicted time series.
arXiv Detail & Related papers (2025-05-21T04:12:12Z)
- Dynamic Perturbed Adaptive Method for Infinite Task-Conflicting Time Series [0.0]
We formulate time series tasks as input-output mappings under varying objectives, where the same input may yield different outputs. To study this, we construct a synthetic dataset with numerous conflicting subtasks to evaluate adaptation under frequent task shifts. We propose a dynamic perturbed adaptive method based on a trunk-branch architecture, where the trunk evolves slowly to capture long-term structure.
arXiv Detail & Related papers (2025-05-17T08:33:57Z)
- Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data [0.25322020135765466]
Quasi Zigzag Persistent Homology (QZPH) is a framework for analyzing time-varying data. We introduce a stable topological invariant that captures both static and dynamic features at different scales.
arXiv Detail & Related papers (2025-02-22T02:53:26Z)
- Revisiting Dynamic Evaluation: Online Adaptation for Large Language Models [88.47454470043552]
We consider the problem of online fine-tuning of the parameters of a language model at test time, also known as dynamic evaluation.
Online adaptation turns parameters into temporally changing states and provides a form of context-length extension with memory in weights.
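Dynamic evaluation as described here, one parameter update per test-time example so that the weights act as a temporally evolving state, can be illustrated with a toy linear model. This is a hedged sketch under that assumption, not the paper's LLM procedure; all names and hyperparameters are hypothetical.

```python
import numpy as np

def online_adapt(w, stream, lr=0.05):
    """Dynamic evaluation on a toy linear model: predict each incoming
    example, record the loss, then take one SGD step, so the weights
    become a temporally changing state (illustrative sketch only)."""
    losses = []
    for x, y in stream:
        pred = w @ x
        losses.append((pred - y) ** 2)
        w = w - lr * 2.0 * (pred - y) * x   # one gradient step per example
    return w, losses

rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0])               # ground-truth mapping
xs = rng.normal(size=(60, 2))
stream = [(x, float(x @ w_true)) for x in xs]
w_final, losses = online_adapt(np.zeros(2), stream)
print(w_final)  # drifts from zeros toward w_true as the stream is consumed
```

The key design point mirrored here is that test-time data is consumed sequentially and each update conditions the model on recent context, trading extra compute per example for weights that track the current distribution.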
arXiv Detail & Related papers (2024-03-03T14:03:48Z)
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.