Predicting discrete-time bifurcations with deep learning
- URL: http://arxiv.org/abs/2303.09669v2
- Date: Thu, 8 Feb 2024 20:59:42 GMT
- Title: Predicting discrete-time bifurcations with deep learning
- Authors: Thomas M. Bury, Daniel Dylewsky, Chris T. Bauch, Madhur Anand, Leon Glass, Alvin Shrier, Gil Bub
- Abstract summary: We train a deep learning classifier to provide an EWS for the five local discrete-time bifurcations of codimension-1.
It outperforms commonly used EWS under a wide range of noise intensities and rates of approach to the bifurcation.
It also predicts the correct bifurcation in most cases, with particularly high accuracy for the period-doubling, Neimark-Sacker and fold bifurcations.
- Score: 0.3350491650545292
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many natural and man-made systems are prone to critical transitions -- abrupt
and potentially devastating changes in dynamics. Deep learning classifiers can
provide an early warning signal (EWS) for critical transitions by learning
generic features of bifurcations (dynamical instabilities) from large simulated
training data sets. So far, classifiers have only been trained to predict
continuous-time bifurcations, ignoring rich dynamics unique to discrete-time
bifurcations. Here, we train a deep learning classifier to provide an EWS for
the five local discrete-time bifurcations of codimension-1. We test the
classifier on simulation data from discrete-time models used in physiology,
economics and ecology, as well as experimental data of spontaneously beating
chick-heart aggregates that undergo a period-doubling bifurcation. The
classifier outperforms commonly used EWS under a wide range of noise
intensities and rates of approach to the bifurcation. It also predicts the
correct bifurcation in most cases, with particularly high accuracy for the
period-doubling, Neimark-Sacker and fold bifurcations. Deep learning as a tool
for bifurcation prediction is still in its nascence and has the potential to
transform the way we monitor systems for critical transitions.
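For context on the baseline the classifier is compared against, here is a minimal sketch (not the authors' code) of the commonly used EWS, rolling variance and lag-1 autocorrelation, computed on a noisy logistic map whose parameter drifts slowly toward its period-doubling bifurcation at r = 3. The choice of map, noise level and window length are illustrative assumptions.

```python
import numpy as np

# Noisy logistic map x_{t+1} = r_t * x_t * (1 - x_t) + noise, with the
# parameter r_t ramped slowly toward the period-doubling bifurcation at
# r = 3. All parameter values here are illustrative, not from the paper.
rng = np.random.default_rng(0)
n_steps = 2000
r = np.linspace(2.5, 3.0, n_steps)  # slow approach to the bifurcation
sigma = 0.01                        # noise intensity

x = np.empty(n_steps)
x[0] = 0.6
for t in range(n_steps - 1):
    x[t + 1] = np.clip(r[t] * x[t] * (1.0 - x[t])
                       + sigma * rng.standard_normal(), 0.0, 1.0)

def rolling_ews(series, window=200):
    """Rolling variance and lag-1 autocorrelation, the classic EWS pair."""
    variance = np.full(series.size, np.nan)
    lag1_ac = np.full(series.size, np.nan)
    for t in range(window, series.size):
        seg = series[t - window:t]
        seg = seg - seg.mean()  # crude within-window detrend
        variance[t] = seg.var()
        lag1_ac[t] = np.corrcoef(seg[:-1], seg[1:])[0, 1]
    return variance, lag1_ac

variance, lag1_ac = rolling_ews(x)
# Approaching a period-doubling bifurcation the fixed-point eigenvalue
# tends to -1, so variance rises while lag-1 autocorrelation falls
# toward -1 (for a fold bifurcation it would rise toward +1 instead).
print(f"early: var={variance[300]:.2e}  ac1={lag1_ac[300]:+.2f}")
print(f"late:  var={variance[-1]:.2e}  ac1={lag1_ac[-1]:+.2f}")
```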
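On the deep-learning side, a hedged sketch of what a bifurcation classifier of this kind could look like: a small 1-D CNN over fixed-length time-series windows, with one output per bifurcation type plus a null class. The architecture, window length and exact label set are assumptions for illustration, not the authors' model (the abstract names period-doubling, Neimark-Sacker and fold; transcritical and pitchfork complete the standard five codimension-1 local discrete-time bifurcations).

```python
import torch
import torch.nn as nn

# Hypothetical label set: the five local codimension-1 discrete-time
# bifurcations plus a "null" (no bifurcation) class.
CLASSES = ["period-doubling", "Neimark-Sacker", "fold",
           "transcritical", "pitchfork", "null"]

class BifurcationClassifier(nn.Module):
    """Small 1-D CNN over fixed-length, detrended time-series windows."""

    def __init__(self, n_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool out the time dimension
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):  # x: (batch, 1, window_len)
        return self.head(self.features(x).squeeze(-1))

model = BifurcationClassifier()
logits = model(torch.randn(8, 1, 500))  # batch of 8 windows
probs = logits.softmax(dim=-1)          # per-class probabilities = EWS
print(probs.shape)                      # torch.Size([8, 6])
```

In training, such a network would see many simulated trajectories labelled by the bifurcation they approach; at monitoring time, a rising probability on any non-null class would serve as the warning signal.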
Related papers
- Learning from the past: predicting critical transitions with machine learning trained on surrogates of historical data [3.9617282900065853]
Complex systems can undergo critical transitions, where slowly changing environmental conditions trigger a sudden shift to a new, potentially catastrophic state.
Early warning signals for these events are crucial for decision-making in fields such as ecology, biology and climate science.
We introduce an approach that trains machine learning classifiers directly on surrogate data of past transitions.
arXiv Detail & Related papers (2024-10-13T03:25:49Z)
- DCAST: Diverse Class-Aware Self-Training Mitigates Selection Bias for Fairer Learning [0.0]
Bias unascribed to sensitive features is challenging to identify and typically goes undiagnosed.
Strategies to mitigate unidentified bias and evaluate mitigation methods are crucially needed, yet remain underexplored.
We introduce Diverse Class-Aware Self-Training (DCAST), a model-agnostic mitigation strategy that is aware of class-specific bias.
arXiv Detail & Related papers (2024-09-30T09:26:19Z)
- Bias in Motion: Theoretical Insights into the Dynamics of Bias in SGD Training [7.5041863920639456]
Machine learning systems often acquire biases by leveraging undesired features in the data, impacting accuracy across different sub-populations.
This paper explores the evolution of bias in a teacher-student setup modeling different data sub-populations with a Gaussian-mixture model.
Applying our findings to fairness and robustness, we delineate how and when heterogeneous data and spurious features can generate and amplify bias.
arXiv Detail & Related papers (2024-05-28T15:50:10Z)
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) that applies an affine transformation to the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- FedUV: Uniformity and Variance for Heterogeneous Federated Learning [5.9330433627374815]
Federated learning is a promising framework for training neural networks on widely distributed data, but its performance degrades when that data is heterogeneous.
Recent work has shown this is largely due to the final layer of the network being the most prone to local bias.
We investigate the training dynamics of the classifier by applying SVD to its weights, motivated by the observation that freezing the weights results in constant singular values.
arXiv Detail & Related papers (2024-02-27T15:53:15Z)
- Data Attribution for Diffusion Models: Timestep-induced Bias in Influence Estimation [53.27596811146316]
Diffusion models operate over a sequence of timesteps, rather than the instantaneous input-output relationships of earlier settings.
We present Diffusion-TracIn, which incorporates these temporal dynamics, and observe that samples' loss-gradient norms depend strongly on the timestep.
We introduce Diffusion-ReTrac as a re-normalized adaptation that enables the retrieval of training samples more targeted to the test sample of interest.
arXiv Detail & Related papers (2024-01-17T07:58:18Z)
- Learning noise-induced transitions by multi-scaling reservoir computing [2.9170682727903863]
We develop a machine learning model based on reservoir computing, a type of recurrent neural network, to learn noise-induced transitions.
The trained model generates accurate statistics of transition time and the number of transitions.
It is also aware of the asymmetry of the double-well potential, the rotational dynamics caused by non-detailed balance, and transitions in multi-stable systems.
arXiv Detail & Related papers (2023-09-11T12:26:36Z)
- Prototypical Classifier for Robust Class-Imbalanced Learning [64.96088324684683]
We propose Prototypical, which does not require fitting additional parameters given the embedding network.
Prototypical produces balanced and comparable predictions for all classes even though the training set is class-imbalanced.
We test our method on the CIFAR-10LT, CIFAR-100LT and Webvision datasets, observing that Prototypical obtains substantial improvements compared with the state of the art.
arXiv Detail & Related papers (2021-10-22T01:55:01Z)
- Bias-Variance Tradeoffs in Single-Sample Binary Gradient Estimators [100.58924375509659]
The straight-through (ST) estimator gained popularity due to its simplicity and efficiency.
Several techniques were proposed to improve over ST while keeping the same low computational complexity.
We conduct a theoretical analysis of the bias and variance of these methods in order to understand the tradeoffs and verify the originally claimed properties.
arXiv Detail & Related papers (2021-10-07T15:16:07Z)
- Class Balancing GAN with a Classifier in the Loop [58.29090045399214]
We introduce a novel theoretically motivated Class Balancing regularizer for training GANs.
Our regularizer makes use of the knowledge from a pre-trained classifier to ensure balanced learning of all the classes in the dataset.
We demonstrate the utility of our regularizer in learning representations for long-tailed distributions, achieving better performance than existing approaches on multiple datasets.
arXiv Detail & Related papers (2021-06-17T11:41:30Z)
- Meta Transition Adaptation for Robust Deep Learning with Noisy Labels [61.8970957519509]
This study proposes a new meta-transition-learning strategy for the task.
Specifically, guided by a small set of meta data with clean labels, the noise transition matrix and the classifier parameters can be mutually improved.
Our method extracts the transition matrix more accurately, which naturally leads to more robust performance than prior art.
arXiv Detail & Related papers (2020-06-10T07:27:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.