Application of Common Spatial Patterns in Gravitational Waves Detection
- URL: http://arxiv.org/abs/2201.04086v1
- Date: Tue, 11 Jan 2022 17:23:31 GMT
- Title: Application of Common Spatial Patterns in Gravitational Waves Detection
- Authors: Damodar Dahal
- Abstract summary: We develop and apply a CSP algorithm to the problem of identifying whether a given epoch of multi-detector Gravitational Wave (GW) strains contains coalescences.
We find that our pipeline correctly detects 76 out of 82 confident events from the Gravitational Wave Transient Catalog, using H1 and L1 strains, with a classification score of $93.72 \pm 0.04\%$ under $10 \times 5$ cross-validation.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Common Spatial Patterns (CSP) is a feature extraction algorithm widely used
in Brain-Computer Interface (BCI) Systems for detecting Event-Related
Potentials (ERPs) in multi-channel magneto/electroencephalography (MEG/EEG)
time series data. In this article, we develop and apply a CSP algorithm to the
problem of identifying whether a given epoch of multi-detector Gravitational
Wave (GW) strains contains coalescences. Paired with Signal Processing
techniques and a Logistic Regression classifier, we find that our pipeline is
correctly able to detect 76 out of 82 confident events from the Gravitational Wave
Transient Catalog, using H1 and L1 strains, with a classification score of
$93.72 \pm 0.04\%$ using $10 \times 5$ cross validation. The false negative
events were: GW170817-v3, GW191219_163120-v1, GW200115_042309-v2,
GW200210_092254-v1, GW200220_061928-v1, and GW200322_091133-v1.
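The core of CSP is a generalized eigendecomposition of the per-class spatial covariance matrices: it learns filters that maximize the variance of one class while minimizing it for the other, and the normalized log-variances of the filtered signals serve as features for the downstream classifier. A minimal NumPy/SciPy sketch of this idea (not the paper's exact pipeline; epoch shapes and function names are illustrative):

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(epochs_a, epochs_b, n_pairs=1):
    """Compute CSP spatial filters from two classes of epochs.

    epochs_*: arrays of shape (n_trials, n_channels, n_samples),
    e.g. n_channels = 2 for H1 and L1 strains.
    """
    def mean_cov(epochs):
        # average trace-normalized spatial covariance over trials
        covs = [x @ x.T / np.trace(x @ x.T) for x in epochs]
        return np.mean(covs, axis=0)

    c_a, c_b = mean_cov(epochs_a), mean_cov(epochs_b)
    # generalized eigenproblem: c_a w = lambda (c_a + c_b) w
    eigvals, eigvecs = eigh(c_a, c_a + c_b)
    order = np.argsort(eigvals)
    # keep filters from both ends of the eigenvalue spectrum
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return eigvecs[:, picks].T  # shape (2 * n_pairs, n_channels)

def csp_log_var(filters, epochs):
    # project epochs, then use normalized log-variance as features
    projected = np.einsum('fc,tcs->tfs', filters, epochs)
    var = projected.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))
```

Features produced this way would then be fed to a classifier such as the Logistic Regression model described in the abstract.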
Related papers
- A Neural Network-Based Search for Unmodeled Transients in LIGO-Virgo-KAGRA's Third Observing Run [0.5857761499059161]
This paper presents the results of a Neural Network (NN)-based search for short-duration gravitational-wave transients in data from the third observing run of LIGO, Virgo, and KAGRA.
The search targets unmodeled transients with durations of milliseconds to a few seconds in the 30-1500 Hz frequency band, without assumptions about the incoming signal direction, polarization, or morphology.
arXiv Detail & Related papers (2024-12-27T19:00:01Z)
- Unsupervised Learning Approach to Anomaly Detection in Gravitational Wave Data [0.0]
We propose an unsupervised anomaly detection method using variational autoencoders (VAEs) to analyze gravitational-wave (GW) data.
VAEs accurately reconstruct noise inputs while failing to reconstruct anomalies, such as GW signals, resulting in measurable spikes in the reconstruction error.
This study introduces VAEs as a robust, unsupervised approach for identifying anomalies in GW data, which offers a scalable framework for detecting known and potentially new phenomena in physics.
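The scoring logic behind this kind of approach is independent of the particular autoencoder: fit a reconstruction model on noise-only data, then flag segments whose reconstruction error is anomalously large. A hedged sketch using PCA as a simple linear stand-in for the trained VAE's encoder/decoder (function names and dimensions are illustrative):

```python
import numpy as np

def fit_linear_autoencoder(noise_segments, n_components=4):
    """Fit a PCA 'autoencoder' on noise-only segments (rows = segments).

    A linear stand-in for a trained VAE: a basis learned from noise
    reconstructs noise well but fails on out-of-distribution signals.
    """
    mean = noise_segments.mean(axis=0)
    centered = noise_segments - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]  # decoder basis

def reconstruction_error(model, segments):
    # project onto the learned basis and back, then measure the residual
    mean, basis = model
    centered = segments - mean
    recon = centered @ basis.T @ basis
    return np.linalg.norm(centered - recon, axis=1)
```

Segments whose error exceeds a threshold (e.g. a high percentile of the errors seen on noise) would be flagged as candidate anomalies.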
arXiv Detail & Related papers (2024-11-29T03:18:40Z) - A Classifier-Based Approach to Multi-Class Anomaly Detection Applied to Astronomical Time-Series [0.0]
Anomaly detection is an open problem in many scientific fields.
Most anomaly detection algorithms for astronomical time-series rely either on hand-crafted features or on features generated through unsupervised representation learning.
We introduce a novel approach that leverages the latent space of a neural network classifier for anomaly detection.
arXiv Detail & Related papers (2024-08-05T18:00:00Z) - Lazy Layers to Make Fine-Tuned Diffusion Models More Traceable [70.77600345240867]
A novel arbitrary-in-arbitrary-out (AIAO) strategy makes watermarks resilient to fine-tuning-based removal.
Unlike the existing methods of designing a backdoor for the input/output space of diffusion models, in our method, we propose to embed the backdoor into the feature space of sampled subpaths.
Our empirical studies on the MS-COCO, AFHQ, LSUN, CUB-200, and DreamBooth datasets confirm the robustness of AIAO.
arXiv Detail & Related papers (2024-05-01T12:03:39Z) - Continuous time recurrent neural networks: overview and application to
forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z) - LSTM and CNN application for core-collapse supernova search in
gravitational wave real data [0.0]
Core-collapse supernovae (CCSNe) are expected to emit gravitational wave signals that could be detected by interferometers within the Milky Way and nearby galaxies.
We show the potential of machine learning (ML) for multi-label classification of different simulated CCSNe signals and noise transients using real data.
arXiv Detail & Related papers (2023-01-23T12:12:33Z) - Space-based gravitational wave signal detection and extraction with deep
neural network [13.176946557548042]
Space-based gravitational wave (GW) detectors will be able to observe signals from sources that are otherwise nearly impossible to observe with current ground-based detectors.
Here, we develop a high-accuracy GW signal detection and extraction method for all space-based GW sources.
arXiv Detail & Related papers (2022-07-15T11:48:15Z) - StRegA: Unsupervised Anomaly Detection in Brain MRIs using a Compact
Context-encoding Variational Autoencoder [48.2010192865749]
Unsupervised anomaly detection (UAD) can learn a data distribution from an unlabelled dataset of healthy subjects and then be applied to detect out-of-distribution samples.
This research proposes a compact version of the "context-encoding" VAE (ceVAE) model, combined with pre- and post-processing steps, creating a UAD pipeline (StRegA).
The proposed pipeline achieved a Dice score of $0.642 \pm 0.101$ while detecting tumours in T2w images of the BraTS dataset and $0.859 \pm 0.112$ while detecting artificially induced anomalies.
arXiv Detail & Related papers (2022-01-31T14:27:35Z) - Simulating quench dynamics on a digital quantum computer with
data-driven error mitigation [62.997667081978825]
We present one of the first implementations of several Clifford-data-regression-based methods, which are used to mitigate the effect of noise in real quantum data.
We find that, in general, Clifford-data-regression-based techniques are advantageous compared with zero-noise extrapolation.
These are the largest systems investigated so far in a study of this type.
arXiv Detail & Related papers (2021-03-23T16:56:14Z) - Improving significance of binary black hole mergers in Advanced LIGO
data using deep learning : Confirmation of GW151216 [0.0]
We present a novel Machine Learning (ML) based strategy to search for binary black hole (BBH) mergers in data from ground-based gravitational wave (GW) observatories.
This is the first ML-based search that not only recovers all the compact binary coalescences (CBCs) in the first GW transients catalog (GWTC-1), but also makes a clean detection of GW151216.
arXiv Detail & Related papers (2020-10-16T18:27:48Z) - TadGAN: Time Series Anomaly Detection Using Generative Adversarial
Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z) - Deep learning for gravitational-wave data analysis: A resampling
white-box approach [62.997667081978825]
We apply Convolutional Neural Networks (CNNs) to detect gravitational wave (GW) signals of compact binary coalescences, using single-interferometer data from LIGO detectors.
CNNs were quite precise at detecting noise but not sensitive enough to recall GW signals, meaning that CNNs are better suited for noise reduction than for generating GW triggers.
arXiv Detail & Related papers (2020-09-09T03:28:57Z) - Complete parameter inference for GW150914 using deep learning [0.0]
LIGO and Virgo gravitational-wave observatories have detected many exciting events over the past five years.
As the rate of detections grows with detector sensitivity, this poses a growing computational challenge for data analysis.
We apply deep learning techniques to perform fast likelihood-free Bayesian inference for gravitational waves.
arXiv Detail & Related papers (2020-08-07T18:00:02Z) - Detection of gravitational-wave signals from binary neutron star mergers
using machine learning [52.77024349608834]
We introduce a novel neural-network based machine learning algorithm that uses time series strain data from gravitational-wave detectors.
We find an improvement by a factor of 6 in sensitivity to signals with signal-to-noise ratio below 25.
A conservative estimate indicates that our algorithm introduces on average 10.2 s of latency between signal arrival and generating an alert.
arXiv Detail & Related papers (2020-06-02T10:20:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences of their use.