TNANet: A Temporal-Noise-Aware Neural Network for Suicidal Ideation
Prediction with Noisy Physiological Data
- URL: http://arxiv.org/abs/2401.12733v1
- Date: Tue, 23 Jan 2024 13:11:05 GMT
- Title: TNANet: A Temporal-Noise-Aware Neural Network for Suicidal Ideation
Prediction with Noisy Physiological Data
- Authors: Niqi Liu, Fang Liu, Wenqi Ji, Xinxin Du, Xu Liu, Guozhen Zhao, Wenting
Mu, Yong-Jin Liu
- Abstract summary: We introduce a novel neural network model for analyzing noisy physiological time-series data, named TNANet.
Our TNANet achieves a prediction accuracy of 63.33% on a binary classification task, outperforming state-of-the-art models.
- Score: 21.1401772311666
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The robust generalization of deep learning models in the presence of inherent
noise remains a significant challenge, especially when labels are subjective
and noise is indiscernible in natural settings. This problem is particularly
pronounced in many practical applications. In this paper, we address a special
and important scenario of monitoring suicidal ideation, where time-series data,
such as photoplethysmography (PPG), is susceptible to such noise. Current
methods predominantly focus on image and text data or address artificially
introduced noise, neglecting the complexities of natural noise in time-series
analysis. To tackle this, we introduce a novel neural network model tailored
for analyzing noisy physiological time-series data, named TNANet, which merges
advanced encoding techniques with confidence learning, enhancing prediction
accuracy. Another contribution of our work is the collection of a specialized
dataset of PPG signals derived from real-world environments for suicidal
ideation prediction. Employing this dataset, our TNANet achieves a prediction
accuracy of 63.33% on a binary classification task, outperforming
state-of-the-art models. Furthermore, comprehensive evaluations were conducted
on three other well-known public datasets with artificially introduced noise to
rigorously test TNANet's capabilities. These tests consistently demonstrated
TNANet's superior performance, with accuracy improvements of more than 10%
over baseline methods.
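The abstract's core mechanism pairs a temporal encoder with confidence learning to prune suspect training labels. The snippet below is a minimal, hypothetical sketch of that confident-learning pruning step (the standard recipe of Northcutt et al.) applied to synthetic stand-ins for encoded PPG windows; it is not the authors' TNANet code, and the logistic-regression proxy classifier, the 15% noise rate, and all variable names are illustrative assumptions. The synthetic label flipping also mirrors the artificial-noise injection used in the public-dataset evaluations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Toy stand-in for encoded PPG windows: 200 examples, 16-dim embeddings.
X = rng.normal(size=(200, 16))
true_y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # latent clean labels

# Artificially flip 15% of labels, mirroring the noise-injection protocol.
y = true_y.copy()
flip = rng.random(len(y)) < 0.15
y[flip] ^= 1

# Out-of-sample class probabilities via cross-validation, so each example
# is scored by a model that never saw it during training.
proba = cross_val_predict(
    LogisticRegression(max_iter=1000), X, y, cv=5, method="predict_proba"
)

# Confident learning: a class's threshold is the mean self-confidence of
# the examples that carry that label.
thresholds = np.array([proba[y == c, c].mean() for c in (0, 1)])

# Flag an example when its most confident above-threshold class disagrees
# with its given label.
above = proba >= thresholds
suspected = np.where(above, proba, -np.inf).argmax(axis=1)
is_issue = above.any(axis=1) & (suspected != y)

print(f"flagged {is_issue.sum()} examples; "
      f"{(is_issue & flip).sum()} of {flip.sum()} injected flips caught")

# Retrain on the pruned (presumed-clean) subset only.
clf = LogisticRegression(max_iter=1000).fit(X[~is_issue], y[~is_issue])
```

In a TNANet-style pipeline, the pruned subset would then be used to retrain the temporal network itself rather than this proxy classifier.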
Related papers
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) that applies an affine transformation to the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks [91.15120211190519]
This paper aims to understand the nature of noise in pre-training datasets and to mitigate its impact on downstream tasks.
We propose a light-weight black-box tuning method (NMTune) that applies an affine transformation to the feature space to mitigate the malignant effect of noise.
arXiv Detail & Related papers (2023-09-29T06:18:15Z)
- DiffSTG: Probabilistic Spatio-Temporal Graph Forecasting with Denoising Diffusion Models [53.67562579184457]
This paper focuses on probabilistic STG forecasting, which is challenging due to the difficulty in modeling uncertainties and complex dependencies.
We present the first attempt to generalize the popular denoising diffusion models to STGs, leading to a novel non-autoregressive framework called DiffSTG.
Our approach combines the intrinsic spatio-temporal learning capabilities of STNNs with the uncertainty measurements of diffusion models.
arXiv Detail & Related papers (2023-01-31T13:42:36Z)
- Transfer learning for self-supervised, blind-spot seismic denoising [0.0]
We propose an initial, supervised training of the network on a frugally-generated synthetic dataset prior to fine-tuning in a self-supervised manner on the field dataset of interest.
Considering the change in peak signal-to-noise ratio, as well as the volume of noise reduced and the signal leakage observed, we illustrate the clear benefit of initialising the self-supervised network with weights from a supervised base-training.
arXiv Detail & Related papers (2022-09-25T12:58:10Z)
- Pre-training via Denoising for Molecular Property Prediction [53.409242538744444]
We describe a pre-training technique that utilizes large datasets of 3D molecular structures at equilibrium.
Inspired by recent advances in noise regularization, our pre-training objective is based on denoising.
arXiv Detail & Related papers (2022-05-31T22:28:34Z)
- Deep Active Learning with Noise Stability [24.54974925491753]
Uncertainty estimation for unlabeled data is crucial to active learning.
We propose a novel algorithm that leverages noise stability to estimate data uncertainty.
Our method is generally applicable in various tasks, including computer vision, natural language processing, and structural data analysis.
arXiv Detail & Related papers (2022-05-26T13:21:01Z)
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from the common assumption that the noise distribution should be close to the data distribution can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- The potential of self-supervised networks for random noise suppression in seismic data [0.0]
Blind-spot networks are shown to be an efficient suppressor of random noise in seismic data.
Results are compared with two commonly used random denoising techniques: FX-deconvolution and Curvelet transform.
We believe this is just the beginning of utilising self-supervised learning in seismic applications.
arXiv Detail & Related papers (2021-09-15T14:57:43Z)
- Attribute-Guided Adversarial Training for Robustness to Natural Perturbations [64.35805267250682]
We propose an adversarial training approach that learns to generate new samples so as to maximize the classifier's exposure to the attribute space.
Our approach enables deep neural networks to be robust against a wide range of naturally occurring perturbations.
arXiv Detail & Related papers (2020-12-03T10:17:30Z)