Deep Neural Networks for Approximating Stream Reasoning with C-SPARQL
- URL: http://arxiv.org/abs/2106.08452v1
- Date: Tue, 15 Jun 2021 21:51:47 GMT
- Title: Deep Neural Networks for Approximating Stream Reasoning with C-SPARQL
- Authors: Ricardo Ferreira, Carolina Lopes, Ricardo Gonçalves, Matthias
Knorr, Ludwig Krippahl, João Leite
- Abstract summary: C-SPARQL is a language for continuous queries over streams of RDF data.
We investigate whether reasoning with C-SPARQL can be approximated using Recurrent Neural Networks and Convolutional Neural Networks.
- Score: 0.8677532138573983
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The amount of information produced, whether by newspapers, blogs and social
networks, or by monitoring systems, is increasing rapidly. Processing all this
data in real-time, while taking into consideration advanced knowledge about the
problem domain, is challenging, but required in scenarios where assessing
potential risks in a timely fashion is critical. C-SPARQL, a language for
continuous queries over streams of RDF data, is one of the more prominent
approaches in stream reasoning that provides such continuous inference
capabilities over dynamic data that go beyond mere stream processing. However,
it has been shown that, in the presence of huge amounts of data, C-SPARQL may
not be able to answer queries in time, in particular when new data arrives
faster than it can be reasoned with. In
this paper, we investigate whether reasoning with C-SPARQL can be approximated
using Recurrent Neural Networks and Convolutional Neural Networks, two neural
network architectures that have been shown to be well-suited for time series
forecasting and time series classification, and to leverage their higher
processing speed once the network has been trained. We consider a variety of
different kinds of queries and obtain overall positive results with high
accuracies while improving processing time often by several orders of
magnitude.
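To make the approximation scheme concrete, the following is a minimal illustrative sketch (not the authors' code) of the setup the abstract describes: windows of a data stream are labelled with the answers a C-SPARQL engine would produce for a registered query, and a recurrent network is trained on those labels so that, once trained, it answers much faster than the reasoner. The embedded query, the window encoding, the synthetic data generator, and names such as QueryApproximator and make_batch are all hypothetical assumptions for illustration.

```python
# Illustrative sketch only (not the authors' code): train an RNN to approximate
# the boolean answer of a windowed C-SPARQL-style query. The query, encoding,
# and data generator below are hypothetical assumptions.
import numpy as np
import torch
import torch.nn as nn

# Hypothetical C-SPARQL query the network approximates: does a sensor's average
# temperature within the 30s window exceed 40 degrees?
CSPARQL_QUERY = """
REGISTER QUERY HighTemp AS
PREFIX ex: <http://example.org/>
SELECT ?s FROM STREAM <http://example.org/sensors> [RANGE 30s STEP 5s]
WHERE { ?s ex:temperature ?t } GROUP BY ?s HAVING (AVG(?t) > 40)
"""

WINDOW = 30  # observations per query window

def make_batch(batch_size=64):
    # Each training example is one window of scalar readings; the label mimics
    # the query answer: 1 iff the window average exceeds 40. In the paper's
    # setting, these labels would come from the C-SPARQL engine itself.
    x = np.random.uniform(20, 60, size=(batch_size, WINDOW, 1)).astype(np.float32)
    y = (x.mean(axis=1).squeeze(-1) > 40).astype(np.float32)
    return torch.from_numpy(x), torch.from_numpy(y)

class QueryApproximator(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        _, (h, _) = self.rnn(x)               # final hidden state summarizes the window
        return self.head(h[-1]).squeeze(-1)   # logit: "query returns an answer"

model = QueryApproximator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(500):
    x, y = make_batch()
    loss = loss_fn(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    x, y = make_batch(1000)
    acc = ((model(x) > 0) == (y > 0.5)).float().mean().item()
print(f"approximation accuracy: {acc:.3f}")
```

In the paper's actual setting, labels would be computed offline by running C-SPARQL over recorded streams and the inputs would encode the RDF triples arriving in each window; the sketch substitutes a synthetic numeric stream so that it stays self-contained and runnable.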
Related papers
- Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [55.09326865401653]
We introduce FNF as the backbone and DBD as the architecture to provide excellent learning capabilities and optimal learning pathways for spatial-temporal modeling.
We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
arXiv Detail & Related papers (2025-06-10T18:40:20Z)
- Temporal Convolution Derived Multi-Layered Reservoir Computing [5.261277318790788]
We propose a new mapping of input data into the reservoir's state space.
We incorporate this method into two novel network architectures, increasing the parallelizability, depth, and predictive capabilities of the neural network.
For the chaotic time series, we observe an error reduction of up to 85.45% compared to Echo State Networks and 90.72% compared to Gated Recurrent Units.
arXiv Detail & Related papers (2024-07-09T11:40:46Z)
- State-Space Modeling in Long Sequence Processing: A Survey on Recurrence in the Transformer Era [59.279784235147254]
This survey provides an in-depth summary of the latest approaches that are based on recurrent models for sequential data processing.
The emerging picture suggests that there is room for novel routes, constituted by learning algorithms that depart from standard Backpropagation Through Time.
arXiv Detail & Related papers (2024-06-13T12:51:22Z)
- A Distance Correlation-Based Approach to Characterize the Effectiveness of Recurrent Neural Networks for Time Series Forecasting [1.9950682531209158]
We provide an approach to link time series characteristics with RNN components via the versatile metric of distance correlation.
We empirically show that the RNN activation layers learn the lag structures of time series well.
We also show that the activation layers cannot adequately model moving average and heteroskedastic time series processes.
arXiv Detail & Related papers (2023-07-28T22:32:08Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Online learning of windmill time series using Long Short-term Cognitive Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach achieved the lowest forecasting errors compared to a simple RNN, a Long Short-term Memory network, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z)
- Faster than LASER -- Towards Stream Reasoning with Deep Neural Networks [0.6649973446180738]
Stream reasoners aim at bridging the gap between reasoning and stream processing.
LASER is a stream reasoner designed to analyse and perform complex reasoning over streams of data.
We study whether Convolutional and Recurrent Neural Networks, which have been shown to be particularly well-suited for time series forecasting and classification, can be trained to approximate reasoning with LASER.
arXiv Detail & Related papers (2021-06-15T22:06:12Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Multivariate Time Series Classification Using Spiking Neural Networks [7.273181759304122]
Spiking neural networks have drawn attention as they enable low power consumption.
We present an encoding scheme to convert time series into sparse spatial temporal spike patterns.
A training algorithm to classify spatial temporal patterns is also proposed.
arXiv Detail & Related papers (2020-07-07T15:24:01Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)