Benchmarking GNNs Using Lightning Network Data
- URL: http://arxiv.org/abs/2407.07916v1
- Date: Fri, 5 Jul 2024 20:35:57 GMT
- Title: Benchmarking GNNs Using Lightning Network Data
- Authors: Rainer Feichtinger, Florian Grötschla, Lioba Heimbach, Roger Wattenhofer
- Abstract summary: The Bitcoin Lightning Network is a layer 2 protocol designed to facilitate fast and inexpensive Bitcoin transactions.
We analyze the graph structure of the Lightning Network and investigate the statistical relationships between node properties using machine learning.
- Score: 20.204489886284534
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Bitcoin Lightning Network is a layer 2 protocol designed to facilitate fast and inexpensive Bitcoin transactions. It operates by establishing channels between users, where Bitcoin is locked and transactions are conducted off-chain until the channels are closed, with only the initial and final transactions recorded on the blockchain. Routing transactions through intermediary nodes is crucial for users without direct channels, allowing these routing nodes to collect fees for their services. Nodes announce their channels to the network, forming a graph with channels as edges. In this paper, we analyze the graph structure of the Lightning Network and investigate the statistical relationships between node properties using machine learning, particularly Graph Neural Networks (GNNs). We formulate a series of tasks to explore these relationships and provide benchmarks for GNN architectures, demonstrating how topological and neighbor information enhances performance. Our evaluation of several models reveals the effectiveness of GNNs in these tasks and highlights the insights gained from their application.
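The benchmarks described in the abstract rest on one core GNN operation: aggregating feature information from a node's channel neighbors. The following sketch is purely illustrative (the toy channel graph and feature values are hypothetical, not drawn from the paper's dataset), showing one round of mean-neighbor message passing over a Lightning-style graph where edges are payment channels:

```python
# Toy Lightning-style graph: nodes are routing nodes, edges are payment channels.
# Feature vectors (e.g., capacity, fee rate) are made-up values for illustration.
channels = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
features = {"A": [1.0, 0.2], "B": [0.5, 0.1], "C": [2.0, 0.4], "D": [0.3, 0.05]}

def neighbors(node):
    """All nodes sharing a channel with `node` (channels are undirected)."""
    return [v for u, v in channels if u == node] + [u for u, v in channels if v == node]

def gnn_layer(feats):
    """One round of mean-neighbor aggregation, the building block of message-passing GNNs."""
    out = {}
    for node, x in feats.items():
        msgs = [feats[n] for n in neighbors(node)]
        mean = [sum(vals) / len(msgs) for vals in zip(*msgs)]
        # Combine each node's own features with its aggregated neighbor features.
        out[node] = [0.5 * a + 0.5 * b for a, b in zip(x, mean)]
    return out

h1 = gnn_layer(features)
```

Stacking such layers is what lets a GNN exploit the topological and neighbor information the paper's benchmarks measure.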
Related papers
- DAG-Sword: A Simulator of Large-Scale Network Topologies for DAG-Oriented Proof-of-Work Blockchains [2.0124254762298794]
We focus on DAG-based consensus protocols and present a discrete-event simulator for them.
Our simulator can simulate realistic blockchain networks created from data of a Bitcoin network.
We extend the results of the related work that contains a small-scale network of 10 nodes by the results obtained on a large-scale network with 7000 nodes.
arXiv Detail & Related papers (2023-11-08T12:31:11Z)
- Analysis of Information Propagation in Ethereum Network Using Combined Graph Attention Network and Reinforcement Learning to Optimize Network Efficiency and Scalability [2.795656498870966]
We develop a Graph Attention Network (GAT) and Reinforcement Learning (RL) model to optimize the network efficiency and scalability.
In the experimental evaluation, we analyze the performance of our model on a large-scale dataset.
The results indicate that our GAT-RL model outperforms other GCN models.
arXiv Detail & Related papers (2023-11-02T17:19:45Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- Chainlet Orbits: Topological Address Embedding for the Bitcoin Blockchain [15.099255988459602]
The rise of cryptocurrencies like Bitcoin, which enable transactions with a degree of pseudonymity, has led to a surge in various illicit activities.
We introduce an effective solution called Chainlet Orbits to embed Bitcoin addresses by leveraging their topological characteristics in transactions.
Our approach enables the use of interpretable and explainable machine learning models in as little as 15 minutes for most days on the Bitcoin transaction network.
arXiv Detail & Related papers (2023-05-18T21:16:59Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as a new direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- TSGN: Transaction Subgraph Networks for Identifying Ethereum Phishing Accounts [2.3112192919085826]
We propose a Transaction SubGraph Network (TSGN) based classification model to identify phishing accounts.
We find that TSGNs can provide more potential information to benefit the identification of phishing accounts.
arXiv Detail & Related papers (2021-04-18T08:12:51Z)
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks (ID-GNNs) with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
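The identity-aware idea can be illustrated with a toy message-passing round in which messages originating from a designated root node are weighted differently from all other messages; the graph, weights, and features below are hypothetical, a minimal sketch of the mechanism rather than the paper's actual ID-GNN architecture:

```python
# Toy undirected graph as an adjacency list (hypothetical example).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
feats = {n: 1.0 for n in adj}

def id_gnn_round(root):
    """One identity-aware aggregation round centred on `root`:
    messages from the root use a separate weight (w_id) from ordinary messages (w),
    so nodes can distinguish information that flowed through the root."""
    w, w_id = 1.0, 2.0
    out = {}
    for node in adj:
        total = 0.0
        for nb in adj[node]:
            total += (w_id if nb == root else w) * feats[nb]
        out[node] = total
    return out

h = id_gnn_round(root=0)
```

Because node 1 is adjacent to the root while node 3 is not, they receive different aggregates even though all input features are identical, which is exactly the kind of distinction a plain (1-WL-bounded) GNN cannot make.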
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
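Permutation equivariance, one of the properties named above, can be checked directly on a small polynomial graph filter y = h0·x + h1·Sx + h2·S²x: relabeling the nodes and then filtering gives the same result as filtering and then relabeling. The 3-node path graph and coefficients below are hypothetical, chosen only to make the property concrete:

```python
def matvec(S, x):
    """Multiply an adjacency (shift) matrix S by a signal x."""
    return [sum(s * xi for s, xi in zip(row, x)) for row in S]

def graph_filter(S, x, h):
    """Polynomial graph filter y = h[0]*x + h[1]*S x + h[2]*S^2 x + ..."""
    y = [h[0] * xi for xi in x]
    Sx = x
    for hk in h[1:]:
        Sx = matvec(S, Sx)
        y = [yi + hk * si for yi, si in zip(y, Sx)]
    return y

# Path graph 0-1-2 (hypothetical), a signal, and filter coefficients.
S = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
x = [1.0, 2.0, 3.0]
h = [0.5, 1.0, 0.25]

perm = [2, 0, 1]  # new node i is old node perm[i]
S_p = [[S[perm[i]][perm[j]] for j in range(3)] for i in range(3)]
x_p = [x[perm[i]] for i in range(3)]

y = graph_filter(S, x, h)
y_p = graph_filter(S_p, x_p, h)
# Permutation equivariance: filtering the relabeled graph equals relabeling the output.
assert all(abs(y_p[i] - y[perm[i]]) < 1e-9 for i in range(3))
```

The final assertion is the equivariance property itself; it holds for any polynomial in S because S_p = P S Pᵀ commutes through each power.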
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
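The bilinear idea can be sketched as a standard weighted-sum aggregation augmented with averaged elementwise products over pairs of neighbors; the graph and feature values below are hypothetical, a minimal illustration rather than the BGNN operator itself:

```python
# Toy graph and 2-dimensional node features (hypothetical values).
feats = {0: [1.0, 2.0], 1: [0.5, 1.0], 2: [2.0, 0.5]}
adj = {0: [1, 2], 1: [0], 2: [0]}

def bilinear_aggregate(node):
    """Linear part: mean of neighbor features.
    Bilinear part: average elementwise product over all neighbor pairs,
    capturing pairwise interactions that a plain weighted sum misses."""
    nbrs = adj[node]
    linear = [sum(feats[n][d] for n in nbrs) / len(nbrs) for d in range(2)]
    pairs = [(a, b) for i, a in enumerate(nbrs) for b in nbrs[i + 1:]]
    if pairs:
        bilin = [sum(feats[a][d] * feats[b][d] for a, b in pairs) / len(pairs)
                 for d in range(2)]
    else:  # nodes with a single neighbor have no pairs to interact
        bilin = [0.0, 0.0]
    return [l + b for l, b in zip(linear, bilin)]

out = bilinear_aggregate(0)
```

For node 0 the linear term averages its two neighbors while the bilinear term multiplies them elementwise, so the output reflects how neighbor features co-occur, not just their sum.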
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences arising from its use.