RULSurv: A probabilistic survival-based method for early censoring-aware prediction of remaining useful life in ball bearings
- URL: http://arxiv.org/abs/2405.01614v3
- Date: Mon, 14 Apr 2025 11:57:40 GMT
- Title: RULSurv: A probabilistic survival-based method for early censoring-aware prediction of remaining useful life in ball bearings
- Authors: Christian Marius Lillelund, Fernando Pannullo, Morten Opprud Jakobsen, Manuel Morante, Christian Fischer Pedersen
- Abstract summary: We introduce a novel and flexible method for early fault detection using Kullback-Leibler divergence and RUL estimation. We demonstrate our approach on the XJTU-SY dataset using a 5-fold cross-validation strategy across three different operating conditions. Our approach achieves a mean cumulative relative accuracy (CRA) of 0.7586 over 5 bearings under the highest load, which improves over several state-of-the-art baselines.
- Score: 39.58317527488534
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Predicting the remaining useful life (RUL) of ball bearings is an active area of research, where novel machine learning techniques are continuously being applied to predict degradation trends and anticipate failures before they occur. However, few studies have explicitly addressed the challenge of handling censored data, where information about a specific event (e.g., mechanical failure) is incomplete or only partially observed. To address this issue, we introduce a novel and flexible method for early fault detection using Kullback-Leibler (KL) divergence and RUL estimation using survival analysis that naturally supports censored data. We demonstrate our approach on the XJTU-SY dataset using a 5-fold cross-validation strategy across three different operating conditions. When predicting the time to failure for bearings under the highest load (C1, 12.0 kN and 2100 RPM) with 25% random censoring, our approach achieves a mean absolute error (MAE) of 14.7 minutes (95% CI = 13.6-15.8) using a linear CoxPH model, and an MAE of 12.6 minutes (95% CI = 11.8-13.4) using a nonlinear Random Survival Forests model, compared to an MAE of 18.5 minutes (95% CI = 17.4-19.6) using a linear LASSO model that does not support censoring. Moreover, our approach achieves a mean cumulative relative accuracy (CRA) of 0.7586 over 5 bearings under the highest load, which improves over several state-of-the-art baselines. Our work highlights the importance of considering censored data as part of the model design when building predictive models for early fault detection and RUL estimation.
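As a rough illustration of the two-stage idea described in the abstract, the sketch below flags fault onset by thresholding the KL divergence between a healthy reference spectrum and incoming vibration windows, then fits censoring-aware survival models (a linear CoxPH model and a nonlinear Random Survival Forest) on censored data. This is a minimal sketch under stated assumptions, not the authors' pipeline: it uses scikit-survival and scipy.stats.entropy as stand-ins, and the feature extraction, synthetic data, and threshold are purely illustrative.

```python
# Hypothetical sketch: KL-divergence fault-onset detection + censoring-aware RUL models.
# All data, feature names, and thresholds below are illustrative assumptions.
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes the KL divergence D(p || q)
from sksurv.ensemble import RandomSurvivalForest
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.util import Surv

rng = np.random.default_rng(0)

def spectrum(window, n_bins=64):
    """Normalized magnitude spectrum of one vibration window (treated as a pmf)."""
    mag = np.abs(np.fft.rfft(window))[:n_bins]
    return mag / mag.sum()

def detect_fault_onset(windows, reference, threshold=0.5):
    """Return the index of the first window whose KL divergence from a
    known-healthy reference spectrum exceeds a (hypothetical) threshold."""
    for i, w in enumerate(windows):
        if entropy(spectrum(w), reference) > threshold:
            return i
    return None

# --- Toy censored survival data: one row of spectral features per bearing window ---
n, p = 200, 10
X = rng.normal(size=(n, p))                    # stand-in for frequency-domain features
time_to_failure = rng.exponential(50.0, n)     # synthetic failure times in minutes
censored = rng.random(n) < 0.25                # ~25% random censoring, as in the paper
observed_time = np.where(censored, time_to_failure * rng.random(n), time_to_failure)
y = Surv.from_arrays(event=~censored, time=observed_time)

# Linear CoxPH and nonlinear Random Survival Forest, both handle censoring natively.
cox = CoxPHSurvivalAnalysis().fit(X, y)
rsf = RandomSurvivalForest(n_estimators=100, random_state=0).fit(X, y)

# Predicted survival curves for a few windows; a point RUL estimate can be read off
# each curve, e.g. as the median or expected survival time.
surv_fns = rsf.predict_survival_function(X[:5], return_array=False)
```

In practice, the predicted survival curves would be reduced to point RUL estimates and scored against the known failure times with metrics such as MAE or CRA, as reported in the abstract.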
Related papers
- Advancing Tabular Stroke Modelling Through a Novel Hybrid Architecture and Feature-Selection Synergy [0.9999629695552196]
The present work develops and validates a data-driven and interpretable machine-learning framework designed to predict strokes. Ten routinely gathered demographic, lifestyle, and clinical variables were sourced from a public cohort of 4,981 records. The proposed model achieved an accuracy rate of 97.2% and an F1-score of 97.15%, indicating a significant enhancement compared to the leading individual model.
arXiv Detail & Related papers (2025-05-18T21:46:45Z)
- Impact of Comprehensive Data Preprocessing on Predictive Modelling of COVID-19 Mortality [0.0]
This study evaluates the impact of a custom data preprocessing pipeline on ten machine learning models predicting COVID-19 mortality.
Our pipeline differs from a standard preprocessing pipeline through four key steps.
arXiv Detail & Related papers (2024-08-15T13:23:59Z)
- TripleSurv: Triplet Time-adaptive Coordinate Loss for Survival Analysis [15.496918127515665]
We propose a time-adaptive coordinate loss function, TripleSurv, to handle the complexities of the learning process and exploit valuable survival time values.
Our TripleSurv is evaluated on three real-world survival datasets and a public synthetic dataset.
arXiv Detail & Related papers (2024-01-05T08:37:57Z)
- Score Matching-based Pseudolikelihood Estimation of Neural Marked Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching estimator for learning marked spatio-temporal point processes with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked spatio-temporal point processes through score-matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z)
- Predicting Survival Time of Ball Bearings in the Presence of Censoring [44.99833362998488]
We propose a novel approach to predict the time to failure in ball bearings using survival analysis.
We analyze bearing data in the frequency domain and annotate when a bearing fails by comparing the Kullback-Leibler divergence and the standard deviation.
We train several survival models to estimate the time to failure based on the annotated data.
arXiv Detail & Related papers (2023-09-13T08:30:31Z)
- CenTime: Event-Conditional Modelling of Censoring in Survival Analysis [49.44664144472712]
We introduce CenTime, a novel approach to survival analysis that directly estimates the time to event.
Our method features an innovative event-conditional censoring mechanism that performs robustly even when uncensored data is scarce.
Our results indicate that CenTime offers state-of-the-art performance in predicting time-to-death while maintaining comparable ranking performance.
arXiv Detail & Related papers (2023-09-07T17:07:33Z)
- Copula-Based Deep Survival Models for Dependent Censoring [10.962520289040336]
This paper presents a parametric model of survival that extends modern non-linear survival analysis by relaxing the assumption of conditional independence.
On synthetic and semi-synthetic data, our approach significantly improves estimates of survival distributions compared to standard approaches that assume conditional independence in the data.
arXiv Detail & Related papers (2023-06-20T21:51:13Z)
- Conservative Prediction via Data-Driven Confidence Minimization [70.93946578046003]
In safety-critical applications of machine learning, it is often desirable for a model to be conservative.
We propose the Data-Driven Confidence Minimization framework, which minimizes confidence on an uncertainty dataset.
arXiv Detail & Related papers (2023-06-08T07:05:36Z)
- Leveraging Unlabeled Data to Predict Out-of-Distribution Performance [63.740181251997306]
Real-world machine learning deployments are characterized by mismatches between the source (training) and target (test) distributions.
In this work, we investigate methods for predicting the target domain accuracy using only labeled source data and unlabeled target data.
We propose Average Thresholded Confidence (ATC), a practical method that learns a threshold on the model's confidence and predicts target accuracy as the fraction of unlabeled examples whose confidence exceeds that threshold.
arXiv Detail & Related papers (2022-01-11T23:01:12Z)
- Imputation-Free Learning from Incomplete Observations [73.15386629370111]
We introduce the importance-guided stochastic gradient descent (IGSGD) method to train models to perform inference directly on inputs containing missing values, without imputation.
We employ reinforcement learning (RL) to adjust the gradients used to train the models via back-propagation.
Our imputation-free predictions outperform the traditional two-step imputation-based predictions using state-of-the-art imputation methods.
arXiv Detail & Related papers (2021-07-05T12:44:39Z)
- Calibration of prediction rules for life-time outcomes using prognostic Cox regression survival models and multiple imputations to account for missing predictor data with cross-validatory assessment [0.0]
Methods are described to combine imputation with predictive calibration in survival modeling subject to censoring.
Prediction-averaging appears to have superior statistical properties, especially smaller predictive variation, as opposed to a direct application of Rubin's rules.
arXiv Detail & Related papers (2021-05-04T20:10:12Z)
- Conformalized Survival Analysis [6.92027612631023]
Existing survival analysis techniques heavily rely on strong modelling assumptions.
We develop an inferential method based on ideas from conformal prediction.
The validity and efficiency of our procedure are demonstrated on synthetic data and real COVID-19 data from the UK Biobank.
arXiv Detail & Related papers (2021-03-17T16:32:26Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.