Path Signatures and Graph Neural Networks for Slow Earthquake Analysis:
Better Together?
- URL: http://arxiv.org/abs/2402.03558v1
- Date: Mon, 5 Feb 2024 22:16:05 GMT
- Title: Path Signatures and Graph Neural Networks for Slow Earthquake Analysis:
Better Together?
- Authors: Hans Riess, Manolis Veveakis, Michael M. Zavlanos
- Abstract summary: We introduce a novel approach, Path Signature Graph Convolutional Neural Networks (PS-GCNN), integrating path signatures into graph convolutional neural networks (GCNN).
We apply our method to analyze slow earthquake sequences, also called slow slip events (SSE), utilizing data from GPS timeseries.
Our methodology shows promise for future advancement in earthquake prediction and sensor network analysis.
- Score: 10.002197953627359
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The path signature, having enjoyed recent success in the machine learning
community, is a theoretically-driven method for engineering features from
irregular paths. On the other hand, graph neural networks (GNN), neural
architectures for processing data on graphs, excel on tasks with irregular
domains, such as sensor networks. In this paper, we introduce a novel approach,
Path Signature Graph Convolutional Neural Networks (PS-GCNN), integrating path
signatures into graph convolutional neural networks (GCNN), and leveraging the
strengths of both path signatures, for feature extraction, and GCNNs, for
handling spatial interactions. We apply our method to analyze slow earthquake
sequences, also called slow slip events (SSE), utilizing data from GPS
timeseries, with a case study on a GPS sensor network on the east coast of New
Zealand's North Island. We also establish benchmarks for our method on
simulated stochastic differential equations, which model similar
reaction-diffusion phenomena. Our methodology shows promise for future
advancement in earthquake prediction and sensor network analysis.
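The abstract's core idea, path signatures as node features for a graph convolution over a sensor network, can be illustrated with a minimal sketch. This is not the authors' implementation: the toy "GPS" traces, graph, and weights are invented for illustration, the signature is truncated at depth 2, and a single Kipf-Welling-style GCN layer stands in for the full PS-GCNN architecture.

```python
import numpy as np

def signature_depth2(path):
    """Depth-2 path signature of a (T, d) path via discrete iterated sums.

    Level 1: total increments X_T - X_0; level 2: iterated integrals
    S^{ij} ~ sum_t (X_t^i - X_0^i)(X_{t+1}^j - X_t^j).
    Returns a flat vector of length d + d*d.
    """
    inc = np.diff(path, axis=0)                          # (T-1, d) increments dX
    level1 = inc.sum(axis=0)                             # S^i = X_T - X_0
    # Displacement from the start before each increment: X_t - X_0
    disp = np.cumsum(np.vstack([np.zeros(path.shape[1]), inc]), axis=0)[:-1]
    level2 = disp.T @ inc                                # (d, d) level-2 terms
    return np.concatenate([level1, level2.ravel()])

def gcn_layer(A, X, W):
    """One graph convolution with symmetric normalization and ReLU."""
    A_hat = A + np.eye(A.shape[0])                       # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

# Toy sensor network: 4 stations, each with a 2-D displacement time series.
rng = np.random.default_rng(0)
paths = rng.normal(size=(4, 50, 2)).cumsum(axis=1)       # random-walk "GPS" traces
X = np.stack([signature_depth2(p) for p in paths])       # (4, 6) signature features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)                # path-graph adjacency
W = rng.normal(size=(X.shape[1], 3))                     # random layer weights
H = gcn_layer(A, X, W)                                   # (4, 3) node embeddings
print(H.shape)
```

In practice a signature library (e.g. one that computes higher-depth terms efficiently) and a trained GNN stack would replace the hand-rolled pieces, but the data flow, per-node time series to signature features to graph convolution, is the same.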
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- A Finite Element-Inspired Hypergraph Neural Network: Application to Fluid Dynamics Simulations [4.984601297028257]
An emerging trend in deep learning research focuses on the applications of graph neural networks (GNNs) for continuum mechanics simulations.
We present a method to construct a hypergraph by connecting the nodes by elements rather than edges.
We term this method a finite element-inspired hypergraph neural network, in short FEIH($\phi$)-GNN.
arXiv Detail & Related papers (2022-12-30T04:10:01Z)
- LHNN: Lattice Hypergraph Neural Network for VLSI Congestion Prediction [70.31656245793302]
The lattice hypergraph (LH-graph) is a novel graph formulation for circuits.
LHNN consistently achieves more than 35% improvement over U-nets and Pix2Pix on the F1 score.
arXiv Detail & Related papers (2022-03-24T03:31:18Z)
- Charged particle tracking via edge-classifying interaction networks [0.0]
In this work, we adapt the physics-motivated interaction network (IN) GNN to the problem of charged-particle tracking in the high-pileup conditions expected at the HL-LHC.
We demonstrate the IN's excellent edge-classification accuracy and tracking efficiency through a suite of measurements at each stage of GNN-based tracking.
The proposed IN architecture is substantially smaller than previously studied GNN tracking architectures, a reduction in size critical for enabling GNN-based tracking in constrained computing environments.
arXiv Detail & Related papers (2021-03-30T21:58:52Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- Map-Based Temporally Consistent Geolocalization through Learning Motion Trajectories [0.5076419064097732]
We propose a novel trajectory learning method that exploits motion trajectories on a topological map using a recurrent neural network.
Inspired by humans' ability to be aware of both the distance and the direction of self-motion during navigation, our trajectory learning method learns a pattern representation of trajectories, encoded as sequences of distances and turning angles, to assist self-localization.
arXiv Detail & Related papers (2020-10-13T02:08:45Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Inductive Graph Neural Networks for Spatiotemporal Kriging [13.666589510218738]
We develop an inductive graph neural network model to recover data for unsampled sensors on a network/graph structure.
Empirical results on several real-world spatiotemporal datasets demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2020-06-13T01:23:44Z)
- Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum, and a six-DoF robot arm show GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
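The permutation-equivariance property claimed for graph convolutional filters can be checked numerically. The sketch below is illustrative, not taken from the paper: a polynomial graph filter H(S)x = Σ_k h_k S^k x applied to a relabeled graph produces the same output as relabeling the filter's original output, with the shift operator S, filter taps h, and signal x all invented for the demonstration.

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply a polynomial graph filter: sum_k h[k] * S^k @ x."""
    out, Skx = np.zeros_like(x), x.copy()
    for hk in h:
        out += hk * Skx
        Skx = S @ Skx          # advance to the next power of the shift operator
    return out

rng = np.random.default_rng(1)
n = 6
S = rng.random((n, n))
S = (S + S.T) / 2              # symmetric shift operator (e.g. weighted adjacency)
x = rng.normal(size=n)         # graph signal
h = [0.5, -0.2, 0.1]           # arbitrary filter taps

P = np.eye(n)[rng.permutation(n)]            # random permutation matrix
lhs = graph_filter(P @ S @ P.T, P @ x, h)    # filter on the relabeled graph
rhs = P @ graph_filter(S, x, h)              # relabel the original output
print(np.allclose(lhs, rhs))                 # True
```

The identity holds because (P S Pᵀ)^k (P x) = P S^k x for every k, so each term of the polynomial commutes with the relabeling.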
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.