Learning Signal Temporal Logic through Neural Network for Interpretable
Classification
- URL: http://arxiv.org/abs/2210.01910v2
- Date: Fri, 30 Jun 2023 18:39:19 GMT
- Title: Learning Signal Temporal Logic through Neural Network for Interpretable
Classification
- Authors: Danyang Li, Mingyu Cai, Cristian-Ioan Vasile, Roberto Tron
- Abstract summary: We propose an explainable neural-symbolic framework for the classification of time-series behaviors.
We demonstrate the computational efficiency, compactness, and interpretability of the proposed method through driving scenarios and naval surveillance case studies.
- Score: 13.829082181692872
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning techniques using neural networks have achieved promising
success for time-series data classification. However, the models that they
produce are challenging to verify and interpret. In this paper, we propose an
explainable neural-symbolic framework for the classification of time-series
behaviors. In particular, we use an expressive formal language, namely Signal
Temporal Logic (STL), to constrain the search of the computation graph for a
neural network. We design a novel time function and sparse softmax function to
improve the soundness and precision of the neural-STL framework. As a result,
we can efficiently learn a compact STL formula for the classification of
time-series data through off-the-shelf gradient-based tools. We demonstrate the
computational efficiency, compactness, and interpretability of the proposed
method through driving scenarios and naval surveillance case studies, compared
with state-of-the-art baselines.
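As a rough illustration of the gradient-based formulation described above (the paper's exact time function and sparse softmax are not reproduced here), the sketch below uses a standard log-sum-exp smoothing of the max/min underlying the "eventually"/"always" operators so that the robustness of a simple STL predicate can be fit to labels with an off-the-shelf optimizer. All function names, the toy data, and the hyperparameters are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only (assumed PyTorch implementation, not the authors' code):
# a differentiable surrogate for STL robustness that standard gradient-based
# optimizers can train, using log-sum-exp in place of hard max/min.
import torch

def soft_max(x, temp=10.0, dim=-1):
    # Smooth approximation of max; larger temp gives a tighter approximation.
    return torch.logsumexp(temp * x, dim=dim) / temp

def soft_min(x, temp=10.0, dim=-1):
    # Smooth approximation of min, via -max(-x).
    return -soft_max(-x, temp=temp, dim=dim)

def predicate_robustness(signal, a, b):
    # Robustness of the atomic predicate a^T x_t >= b at every time step.
    return signal @ a - b

def eventually(rho, temp=10.0):
    # "Eventually phi" over the whole horizon: (soft) max over time.
    return soft_max(rho, temp=temp, dim=-1)

def always(rho, temp=10.0):
    # "Always phi" over the whole horizon: (soft) min over time.
    return soft_min(rho, temp=temp, dim=-1)

# Toy usage: classify 1-D signals by whether they ever exceed a threshold.
torch.manual_seed(0)
signals = torch.randn(32, 50, 1)                       # (batch, time, features)
labels = (signals.max(dim=1).values[:, 0] > 1.0).float()

a = torch.randn(1, requires_grad=True)                 # learned predicate weight
b = torch.zeros(1, requires_grad=True)                 # learned predicate threshold
opt = torch.optim.Adam([a, b], lr=0.05)

for _ in range(200):
    rho = predicate_robustness(signals, a, b)          # (batch, time)
    score = eventually(rho)                            # formula robustness per signal
    loss = torch.nn.functional.binary_cross_entropy_with_logits(score, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the full neural-STL framework, a sparse softmax over candidate predicates and operators would additionally select the formula structure, and the learned time function would bound the temporal intervals of "eventually"/"always"; the smoothing above is only meant to show why off-the-shelf gradient-based tools apply.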
Related papers
- Time Elastic Neural Networks [2.1756081703276]
We introduce and detail an atypical neural network architecture called the time elastic neural network (teNN).
The novelty compared to classical neural network architectures is that it explicitly incorporates time-warping ability.
We demonstrate that, during the training process, the teNN succeeds in reducing the number of neurons required within each cell.
arXiv Detail & Related papers (2024-05-27T09:01:30Z)
- MTS2Graph: Interpretable Multivariate Time Series Classification with Temporal Evolving Graphs [1.1756822700775666]
We introduce a new framework for interpreting time series data by extracting and clustering representative input patterns.
We run experiments on eight datasets of the UCR/UEA archive, along with HAR and PAM datasets.
arXiv Detail & Related papers (2023-06-06T16:24:27Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAIN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- Weighted Graph-Based Signal Temporal Logic Inference Using Neural Networks [3.2773502246783237]
We train neural networks to learn spatial-temporal properties in the form of weighted graph-based signal temporal logic (wGSTL) formulas.
We use a COVID-19 dataset and a rain prediction dataset to evaluate the performance of the proposed framework.
The classification accuracy obtained by the proposed framework is comparable with the baseline classification methods.
arXiv Detail & Related papers (2021-09-16T16:06:54Z)
- FF-NSL: Feed-Forward Neural-Symbolic Learner [70.978007919101]
This paper introduces a neural-symbolic learning framework, called Feed-Forward Neural-Symbolic Learner (FF-NSL)
FF-NSL integrates state-of-the-art ILP systems based on the answer set semantics with neural networks to learn interpretable hypotheses from labelled unstructured data.
arXiv Detail & Related papers (2021-06-24T15:38:34Z)
- Meta-Learning for Koopman Spectral Analysis with Short Time-series [49.41640137945938]
Existing methods require long time-series for training neural networks.
We propose a meta-learning method for estimating embedding functions from unseen short time-series.
We experimentally demonstrate that the proposed method achieves better performance in terms of eigenvalue estimation and future prediction.
arXiv Detail & Related papers (2021-02-09T07:19:19Z)
- NSL: Hybrid Interpretable Learning From Noisy Raw Data [66.15862011405882]
This paper introduces a hybrid neural-symbolic learning framework, called NSL, that learns interpretable rules from labelled unstructured data.
NSL combines pre-trained neural networks for feature extraction with FastLAS, a state-of-the-art ILP system for rule learning under the answer set semantics.
We demonstrate that NSL is able to learn robust rules from MNIST data and achieve comparable or superior accuracy when compared to neural network and random forest baselines.
arXiv Detail & Related papers (2020-12-09T13:02:44Z)
- Graph Neural Networks for Leveraging Industrial Equipment Structure: An application to Remaining Useful Life Estimation [21.297461316329453]
We propose to capture the structure of complex equipment in the form of a graph and use graph neural networks (GNNs) to model multi-sensor time-series data.
We observe that the proposed GNN-based RUL estimation model compares favorably to several strong baselines from literature such as those based on RNNs and CNNs.
arXiv Detail & Related papers (2020-06-30T06:38:08Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data (see the sum-product sketch below).
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
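As a hedged illustration of the sum-product scheme referenced in the last entry above, the sketch below runs forward-backward message passing over a stationary chain factor graph. In the cited paper the factors would be produced by neural networks; here hypothetical random tables (`pairwise`, `emission`) stand in for them.

```python
# Illustrative sketch (not the cited paper's code): sum-product message passing
# on a stationary chain factor graph with hypothetical factor tables.
import numpy as np

rng = np.random.default_rng(0)
T, S = 6, 3                                        # time steps, discrete states
pairwise = rng.random((S, S))
pairwise /= pairwise.sum(axis=1, keepdims=True)    # stationary transition factor
emission = rng.random((T, S))                      # factor tying each observation to the state

# Forward and backward messages along the chain.
fwd = np.zeros((T, S))
bwd = np.zeros((T, S))
fwd[0] = emission[0] / emission[0].sum()
for t in range(1, T):
    fwd[t] = emission[t] * (fwd[t - 1] @ pairwise)
    fwd[t] /= fwd[t].sum()                         # normalize for numerical stability
bwd[-1] = 1.0
for t in range(T - 2, -1, -1):
    bwd[t] = pairwise @ (emission[t + 1] * bwd[t + 1])
    bwd[t] /= bwd[t].sum()

# Per-time-step marginals over the latent state.
marginals = fwd * bwd
marginals /= marginals.sum(axis=1, keepdims=True)
print(marginals)
```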