Steganography of Steganographic Networks
- URL: http://arxiv.org/abs/2302.14521v1
- Date: Tue, 28 Feb 2023 12:27:34 GMT
- Title: Steganography of Steganographic Networks
- Authors: Guobiao Li, Sheng Li, Meiling Li, Xinpeng Zhang, Zhenxing Qian
- Abstract summary: Steganography is a technique for covert communication between two parties.
We propose a novel scheme for steganography of steganographic networks in this paper.
- Score: 23.85364443400414
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Steganography is a technique for covert communication between two
parties. With the rapid development of deep neural networks (DNNs), an
increasing number of steganographic networks have been proposed recently and
shown to achieve promising performance. Unlike traditional handcrafted
steganographic tools, a steganographic network is relatively large in size,
which raises concerns about how to covertly transmit it over public channels, a
crucial stage in the pipeline of real-world steganography applications. To
address this issue, we propose a novel scheme for the steganography of
steganographic networks. Unlike existing steganographic schemes, which focus on
subtle modifications of the cover data to accommodate the secrets, we propose to
disguise a steganographic network (termed the secret DNN model) as a stego DNN
model that performs an ordinary machine learning task (termed the stego task).
During model disguising, we select and tune a subset of filters in the secret
DNN model to preserve its function on the secret task, while the remaining
filters are reactivated according to a partial optimization strategy to disguise
the whole secret DNN model as a stego DNN model. The secret DNN model can be
recovered from the stego DNN model when needed. Various experiments demonstrate
the advantage of the proposed method for covert communication of steganographic
networks as well as general DNN models.
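As a rough illustration of the disguising procedure described in the abstract, the PyTorch sketch below selects "important" filters by L1 norm (an assumed proxy; the paper's actual selection and tuning criteria may differ), freezes them, and retunes only the remaining filters on the stego task. All names are illustrative, not from the authors' code, and here the selected filters are simply frozen, whereas the paper additionally tunes them to preserve the secret task.

```python
# Hedged sketch of filter-level model disguising; not the authors' implementation.
import torch
import torch.nn as nn

def select_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> torch.Tensor:
    """Rank output filters by L1 norm (an assumed importance proxy) and
    return the indices of the most important ones."""
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # one score per filter
    k = max(1, int(keep_ratio * scores.numel()))
    return torch.topk(scores, k).indices

def disguise(secret_model: nn.Module, stego_loader, epochs: int = 5) -> dict:
    """Freeze the selected (secret-task) filters and retune only the remaining
    filters on the stego task -- a partial-optimization strategy in spirit."""
    frozen = {}  # module name -> boolean mask over output filters (True = frozen)
    for name, m in secret_model.named_modules():
        if isinstance(m, nn.Conv2d):
            mask = torch.zeros(m.out_channels, dtype=torch.bool)
            mask[select_filters(m)] = True
            frozen[name] = mask

    opt = torch.optim.Adam(secret_model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in stego_loader:
            opt.zero_grad()
            loss_fn(secret_model(x), y).backward()
            # Partial optimization: cancel gradients of the frozen (secret) filters
            for name, m in secret_model.named_modules():
                if isinstance(m, nn.Conv2d) and m.weight.grad is not None:
                    m.weight.grad[frozen[name]] = 0.0
                    if m.bias is not None and m.bias.grad is not None:
                        m.bias.grad[frozen[name]] = 0.0
            opt.step()
    return frozen  # acts as side information for recovering the secret model
```

In this toy version, recovery would amount to restoring the retuned filters from shared side information; the real scheme's recovery mechanism is specified in the paper.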
Related papers
- Stealing Training Graphs from Graph Neural Networks [54.52392250297907]
Graph Neural Networks (GNNs) have shown promising results in modeling graphs in various tasks.
As neural networks can memorize the training samples, the model parameters of GNNs have a high risk of leaking private training data.
We investigate a novel problem of stealing graphs from trained GNNs.
arXiv Detail & Related papers (2024-11-17T23:15:36Z)
- Cover-separable Fixed Neural Network Steganography via Deep Generative Models [37.08937194546323]
We propose Cover-separable Fixed Neural Network Steganography (Cs-FNNS).
In Cs-FNNS, we propose a Steganographic Perturbation Search (SPS) algorithm to directly encode the secret data into an imperceptible perturbation (see the sketch after this entry).
We demonstrate the superior performance of the proposed method in terms of visual quality and undetectability.
arXiv Detail & Related papers (2024-07-16T05:47:06Z)
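The Cs-FNNS entry above hinges on searching for a perturbation that a fixed, never-trained decoder maps to the secret bits. The sketch below shows that general idea under assumptions of mine (a random-weight decoder shared via a seed, an L-infinity budget); it is not the paper's SPS algorithm.

```python
# Hedged sketch of perturbation search against a fixed decoder network.
import torch
import torch.nn as nn

def fixed_decoder(seed: int = 0) -> nn.Module:
    torch.manual_seed(seed)  # sender and receiver derive the same weights
    dec = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 3, padding=1),  # logits for one bit per pixel
    )
    for p in dec.parameters():
        p.requires_grad_(False)  # the decoder is never trained
    return dec

def search_perturbation(cover, bits, dec, steps=500, eps=8 / 255, lr=0.01):
    """Optimize an additive perturbation so dec(cover + delta) decodes to `bits`,
    clamped to an (assumed) L-infinity budget for imperceptibility."""
    delta = torch.zeros_like(cover, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        opt.zero_grad()
        bce(dec((cover + delta).clamp(0, 1)), bits).backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)
    return (cover + delta).clamp(0, 1).detach()

# Receiver side: recovered_bits = (dec(stego) > 0).float()
```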
- CNN2GNN: How to Bridge CNN with GNN [59.42117676779735]
We propose a novel CNN2GNN framework to unify CNNs and GNNs via distillation.
The performance of the distilled "boosted" two-layer GNN on Mini-ImageNet is much higher than that of CNNs with dozens of layers, such as ResNet152 (a generic distillation loss is sketched after this entry).
arXiv Detail & Related papers (2024-04-23T08:19:08Z)
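The CNN2GNN entry above relies on distillation. For reference, the following is the standard soft-target distillation loss (Hinton-style), with the CNN as teacher and the GNN as student; the paper's exact objective is not specified in the summary.

```python
# Standard knowledge-distillation loss; not the paper's exact formulation.
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend a soft-target KL term (teacher -> student) with the hard-label loss."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is roughly temperature-independent
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```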
- Graph Coordinates and Conventional Neural Networks -- An Alternative for Graph Neural Networks [0.10923877073891444]
We propose Topology Coordinate Neural Network (TCNN) and Directional Virtual Coordinate Neural Network (DVCNN) as novel alternatives to message passing GNNs.
TCNN and DVCNN achieve competitive or superior performance to message passing GNNs.
Our work expands the toolbox of techniques for graph-based machine learning; a rough sketch of the coordinate idea follows this entry.
arXiv Detail & Related papers (2023-12-03T10:14:10Z)
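To make the "graph coordinates" idea in the entry above concrete, here is a minimal sketch under my own assumptions: each node is embedded by its shortest-path distances to a few anchor nodes, and a plain MLP (no message passing) consumes those coordinates. TCNN/DVCNN define their coordinates and anchor selection more carefully.

```python
# Hedged sketch: distance-to-anchor coordinates fed to a conventional network.
import networkx as nx
import torch
import torch.nn as nn

def graph_coordinates(G: nx.Graph, num_anchors: int = 4) -> torch.Tensor:
    """Embed each node by shortest-path distances to `num_anchors` anchors
    (naive anchor choice here; a real method would pick anchors carefully)."""
    anchors = list(G.nodes)[:num_anchors]
    coords = []
    for v in G.nodes:
        dist = nx.single_source_shortest_path_length(G, v)
        coords.append([float(dist.get(a, -1)) for a in anchors])  # -1 = unreachable
    return torch.tensor(coords)  # shape [num_nodes, num_anchors]

G = nx.karate_club_graph()
x = graph_coordinates(G)
mlp = nn.Sequential(nn.Linear(x.size(1), 32), nn.ReLU(), nn.Linear(32, 2))
logits = mlp(x)  # per-node class logits, with no message passing anywhere
```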
- Towards Deep Network Steganography: From Networks to Networks [23.853644434004135]
We propose deep network steganography for the covert communication of DNN models.
Our scheme is learning-task oriented: the learning task of the secret DNN model is disguised as another, ordinary learning task.
We conduct experiments on both the intra-task steganography and inter-task steganography.
arXiv Detail & Related papers (2023-07-07T08:02:01Z)
- CNN-Assisted Steganography -- Integrating Machine Learning with Established Steganographic Techniques [5.0468312081378475]
We propose a method to improve steganography by increasing the resilience of stego-media to discovery through steganalysis.
Our approach enhances a class of steganographic approaches through the inclusion of a steganographic assistant convolutional neural network (SA-CNN).
Our results show that such steganalyzers are less effective when SA-CNN is employed during the generation of a stego-image.
arXiv Detail & Related papers (2023-04-25T00:19:23Z)
- GrOVe: Ownership Verification of Graph Neural Networks using Embeddings [13.28269672097063]
Graph neural networks (GNNs) have emerged as a state-of-the-art approach to model and draw inferences from large-scale graph-structured data.
Prior work has shown that GNNs are prone to model extraction attacks.
We present GrOVe, a state-of-the-art GNN model fingerprinting scheme.
arXiv Detail & Related papers (2023-04-17T19:06:56Z)
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop Identity-aware Graph Neural Networks (ID-GNNs), a class of message-passing GNNs with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing (see the sketch after this entry).
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
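A minimal sketch of identity-aware message passing as I read the entry above: the tracked (ego/center) node's messages pass through a separate weight matrix, which is what distinguishes it from ordinary, identity-agnostic message passing. This is a dense-adjacency toy version, not the authors' implementation.

```python
# Hedged sketch of one identity-aware message-passing layer.
import torch
import torch.nn as nn

class IDGNNLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.w_regular = nn.Linear(dim, dim)   # messages from ordinary nodes
        self.w_identity = nn.Linear(dim, dim)  # messages from the colored node

    def forward(self, h, adj, center):
        """h: [n, dim] node features; adj: [n, n] adjacency of the ego network;
        center: index of the node whose identity is tracked (colored)."""
        is_center = torch.zeros(h.size(0), 1, device=h.device)
        is_center[center] = 1.0
        msgs = is_center * self.w_identity(h) + (1 - is_center) * self.w_regular(h)
        return torch.relu(adj @ msgs)  # sum-aggregate neighbors' messages
```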
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP); a generic sketch of this style of regularizer follows this entry.
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
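The entry above describes a weight-preserving module. Below is a generic sketch of that family of regularizers (EWC-style quadratic anchoring); TWP's distinguishing feature, per the summary, is deriving the importance weights from graph topology, so `importance` is left abstract and assumed precomputed.

```python
# Hedged sketch of a weight-preserving penalty for continual learning.
import torch
import torch.nn as nn

def preserve_penalty(model: nn.Module, old_params: dict, importance: dict, lam=1.0):
    """Quadratic penalty anchoring important weights near their values after the
    previous task; `importance[name]` has the same shape as the parameter."""
    loss = torch.zeros(())
    for name, p in model.named_parameters():
        loss = loss + (importance[name] * (p - old_params[name]).pow(2)).sum()
    return lam * loss

# Training on a new task: total = task_loss + preserve_penalty(model, old, imp)
```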
- Attentive Graph Neural Networks for Few-Shot Learning [74.01069516079379]
Graph Neural Networks (GNNs) have demonstrated superior performance in many challenging applications, including few-shot learning tasks.
Despite their powerful capacity to learn and generalize from few samples, GNNs usually suffer from severe over-fitting and over-smoothing as the model becomes deep.
We propose a novel Attentive GNN to tackle these challenges, by incorporating a triple-attention mechanism.
arXiv Detail & Related papers (2020-07-14T07:43:09Z)
- Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum and a six DoF robot arm show GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.