Semantics-enhanced Temporal Graph Networks for Content Caching and
Energy Saving
- URL: http://arxiv.org/abs/2301.12355v2
- Date: Thu, 2 Feb 2023 09:29:35 GMT
- Title: Semantics-enhanced Temporal Graph Networks for Content Caching and
Energy Saving
- Authors: Jianhang Zhu, Rongpeng Li, Xianfu Chen, Shiwen Mao, Jianjun Wu,
Zhifeng Zhao
- Abstract summary: We propose a reformative temporal graph network, named STGN, that utilizes extra semantic messages to enhance the temporal and structural learning of a DGNN model.
We also propose a user-specific attention mechanism to aggregate the various semantics at a fine granularity.
- Score: 21.693946854653785
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The enormous amount of network equipment and users implies a
tremendous growth of Internet traffic for multimedia services. To mitigate the
traffic pressure, architectures with in-network storage have been proposed to
cache popular content at nodes in close proximity to users, shortening the
backhaul links. Meanwhile, the reduction of transmission distance also
contributes to energy saving. However, due to limited storage, only a fraction
of the content can be cached, so caching the most popular content is the most
cost-effective strategy. Correspondingly, it becomes essential to devise an
effective popularity prediction method. In this regard, existing efforts adopt
dynamic graph neural network (DGNN) models, but it remains challenging for them
to tackle sparse datasets. In this paper, we first propose a reformative
temporal graph network, named STGN, which utilizes extra semantic messages to
enhance the temporal and structural learning of a DGNN model, since the
consideration of semantics can help establish implicit paths within the sparse
interaction graph and hence improve the prediction performance. Furthermore, we
propose a user-specific attention mechanism to aggregate the various semantics
at a fine granularity. Finally, extensive simulations verify the superiority of
our STGN models and demonstrate their high potential for energy saving.
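The caching decision the abstract motivates can be sketched in a few lines: once a predictor (such as the proposed STGN) assigns popularity scores to content items, a storage-limited node simply caches the highest-scoring items that fit its capacity. The content IDs and scores below are illustrative placeholders, not data or results from the paper.

```python
import heapq

def select_cache(predicted_popularity, capacity):
    """Given predicted per-content popularity scores, return the IDs of the
    items worth caching under a limited storage capacity."""
    # Keep the `capacity` items with the highest predicted popularity.
    top = heapq.nlargest(capacity, predicted_popularity.items(),
                         key=lambda kv: kv[1])
    return {content_id for content_id, _ in top}

# Hypothetical scores a popularity predictor might output for four items.
scores = {"video_a": 0.92, "video_b": 0.35, "clip_c": 0.78, "song_d": 0.51}
cached = select_cache(scores, capacity=2)
# Requests for the cached items are served locally, avoiding the backhaul.
```

The better the popularity prediction, the higher the cache hit ratio, and the more backhaul traffic and transmission energy are saved.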
Related papers
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- CLSA: Contrastive Learning-based Survival Analysis for Popularity
Prediction in MEC Networks [36.01752474571776]
Mobile Edge Caching (MEC) integrated with Deep Neural Networks (DNNs) is an innovative technology with significant potential for the future generation of wireless networks.
The MEC network's effectiveness heavily relies on its capacity to predict and dynamically update the storage of caching nodes with the most popular contents.
To be effective, a DNN-based popularity prediction model needs to have the ability to understand the historical request patterns of content.
arXiv Detail & Related papers (2023-03-21T15:57:46Z)
- EGRC-Net: Embedding-induced Graph Refinement Clustering Network [66.44293190793294]
We propose a novel graph clustering network called the Embedding-Induced Graph Refinement Clustering Network (EGRC-Net).
EGRC-Net effectively utilizes the learned embedding to adaptively refine the initial graph and enhance the clustering performance.
Our proposed methods consistently outperform several state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-19T09:08:43Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces this computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- AoI-based Temporal Attention Graph Neural Network for Popularity
Prediction and Content Caching [9.16219929722585]
An information-centric network (ICN) aims to proactively keep a limited amount of popular content at the edge of the network based on predicted results.
In this paper, we leverage an effective dynamic graph neural network (DGNN) to jointly learn the structural and temporal patterns embedded in the bipartite graph.
We also propose an age of information (AoI) based attention mechanism to extract valuable historical information.
arXiv Detail & Related papers (2022-08-18T02:57:17Z)
- TEDGE-Caching: Transformer-based Edge Caching Towards 6G Networks [30.160404936777947]
Mobile Edge Caching (MEC) in 6G networks has evolved into an efficient solution to meet the phenomenal growth of global mobile data traffic.
Recent advancements in Deep Neural Networks (DNNs) have drawn much research attention to predict the content popularity in proactive caching schemes.
We propose an edge caching framework incorporating an attention-based Vision Transformer (ViT) neural network, referred to as Transformer-based Edge (TEDGE) caching.
arXiv Detail & Related papers (2021-12-01T16:38:18Z)
- Semi-supervised Network Embedding with Differentiable Deep Quantisation [81.49184987430333]
We develop d-SNEQ, a differentiable quantisation method for network embedding.
d-SNEQ incorporates a rank loss to equip the learned quantisation codes with rich high-order information.
It is able to substantially compress the size of trained embeddings, thus reducing storage footprint and accelerating retrieval speed.
arXiv Detail & Related papers (2021-08-20T11:53:05Z)
- MAF-GNN: Multi-adaptive Spatiotemporal-flow Graph Neural Network for
Traffic Speed Forecasting [3.614768552081925]
We propose a Multi-adaptive Spatiotemporal-flow Graph Neural Network (MAF-GNN) for traffic speed forecasting.
MAF-GNN introduces an effective Multi-adaptive Adjacency Matrices Mechanism to capture multiple latent spatial dependencies between traffic nodes.
It achieves better performance than other models on two real-world public traffic network datasets, METR-LA and PeMS-Bay.
arXiv Detail & Related papers (2021-08-08T09:06:43Z)
- Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.