Tool-Supported Architecture-Based Data Flow Analysis for Confidentiality
- URL: http://arxiv.org/abs/2308.01645v1
- Date: Thu, 3 Aug 2023 09:21:20 GMT
- Title: Tool-Supported Architecture-Based Data Flow Analysis for Confidentiality
- Authors: Felix Schwickerath, Nicolas Boltz, Sebastian Hahner, Maximilian
Walter, Christopher Gerking, Robert Heinrich
- Abstract summary: We reimplemented a data flow analysis as a Java-based tool that identifies access violations based on the data flow.
The evaluation of our tool indicates that it can analyze scenarios similar to the existing analysis and, for certain scenarios, scales better.
- Score: 1.6544671438664054
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the increasing interconnection between systems, the need for
confidential systems is growing. Confidential systems share data only with
authorized entities. However, estimating the confidentiality of a system is
complex, and adjusting already deployed software is costly. It is therefore
helpful to have confidentiality analyses that can estimate confidentiality
already at design time. Based on an existing data-flow-based confidentiality
analysis concept, we reimplemented a data flow analysis as a Java-based tool.
The tool uses the software architecture to identify access violations based
on the data flow. The evaluation of our tool indicates that it can analyze
scenarios similar to the existing analysis and, for certain scenarios, scales
better.
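To make the described analysis idea concrete, the following is a minimal, hypothetical Java sketch of an architecture-based data flow check: data items carry classification labels, components declare the clearances they hold, and a violation is reported whenever data flows to a component whose clearances do not cover the data's classification. All class and method names are illustrative assumptions and do not reflect the actual tool's API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Minimal, hypothetical sketch of an architecture-based data flow check.
// Names and structure are illustrative; they do not mirror the tool's real API.
public class DataFlowConfidentialityCheck {

    // A component of the software architecture with the access rights it holds.
    record Component(String name, Set<String> clearances) {}

    // A directed data flow carrying data with a given classification.
    record DataFlow(Component source, Component target, String dataClassification) {}

    // A detected access violation: data reached a component lacking the required clearance.
    record Violation(DataFlow flow) {
        @Override
        public String toString() {
            return "Violation: '" + flow.dataClassification() + "' data flows from "
                    + flow.source().name() + " to " + flow.target().name()
                    + ", which lacks the required clearance.";
        }
    }

    // Check every data flow: the target must be cleared for the data's classification.
    static List<Violation> analyze(List<DataFlow> flows) {
        List<Violation> violations = new ArrayList<>();
        for (DataFlow flow : flows) {
            if (!flow.target().clearances().contains(flow.dataClassification())) {
                violations.add(new Violation(flow));
            }
        }
        return violations;
    }

    public static void main(String[] args) {
        Component database = new Component("CustomerDatabase", Set.of("personal", "public"));
        Component reporting = new Component("ReportingService", Set.of("public"));

        List<DataFlow> flows = List.of(
                new DataFlow(database, reporting, "personal"), // violation: reporting lacks "personal"
                new DataFlow(database, reporting, "public"));  // allowed

        analyze(flows).forEach(System.out::println);
    }
}
```

A real architecture-based analysis would derive the flows and clearances from the architectural model instead of hard-coding them, and would typically support richer label structures than the flat string labels used in this sketch.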
Related papers
- Detecting Vulnerabilities in Encrypted Software Code while Ensuring Code Privacy [1.6291732599945337]
Testing companies can perform code analysis tasks on encrypted software code provided by software companies while code privacy is preserved.
The approach combines Static Code Analysis and Searchable Symmetric Encryption.
The index is then used to discover vulnerabilities by carrying out static analysis tasks in a confidential way.
arXiv Detail & Related papers (2025-01-15T22:39:50Z) - Federated Analytics in Practice: Engineering for Privacy, Scalability and Practicality [5.276674920508729]
Cross-device Federated Analytics (FA) is a distributed computation paradigm designed to answer analytics queries about and derive insights from data held locally on users' devices.
Despite FA's broad relevance, the applicability of existing FA systems is limited by compromised accuracy; lack of flexibility for data analytics; and an inability to scale effectively.
We describe our approach to combine privacy, scalability, and practicality to build and deploy a system that overcomes these limitations.
arXiv Detail & Related papers (2024-12-03T10:03:12Z) - Privacy-Preserving Intrusion Detection using Convolutional Neural Networks [0.25163931116642785]
We explore the use case of a model owner providing an analytic service on a customer's private data.
No information about the data shall be revealed to the analyst, and no information about the model shall be leaked to the customer.
We enhance an attack detection system based on Convolutional Neural Networks with privacy-preserving technology based on the PriMIA framework.
arXiv Detail & Related papers (2024-04-15T09:56:36Z) - An Extensible Framework for Architecture-Based Data Flow Analysis for Information Security [1.7749883815108154]
Security-related properties are often analyzed based on data flow diagrams (DFDs).
We present an open and extensible framework for data flow analysis.
The framework is compatible with DFDs and can also extract data flows from the Palladio architectural description language (see the sketch after this list).
arXiv Detail & Related papers (2024-03-14T13:52:41Z) - It Is Time To Steer: A Scalable Framework for Analysis-driven Attack Graph Generation [50.06412862964449]
Attack Graphs (AGs) represent the best-suited solution to support cyber risk assessment for multi-step attacks on computer networks.
Current solutions propose to address the generation problem from the algorithmic perspective and postulate the analysis only after the generation is complete.
This paper rethinks the classic AG analysis through a novel workflow in which the analyst can query the system anytime.
arXiv Detail & Related papers (2023-12-27T10:44:58Z) - Privacy-Preserving Graph Machine Learning from Data to Computation: A
Survey [67.7834898542701]
We focus on reviewing privacy-preserving techniques of graph machine learning.
We first review methods for generating privacy-preserving graph data.
Then we describe methods for transmitting privacy-preserved information.
arXiv Detail & Related papers (2023-07-10T04:30:23Z) - Enabling Inter-organizational Analytics in Business Networks Through
Meta Machine Learning [0.0]
Fear of disclosing sensitive information as well as the sheer volume of the data that would need to be exchanged are key inhibitors for the creation of effective system-wide solutions.
We propose a meta machine learning method that deals with these obstacles to enable comprehensive analyses within a business network.
arXiv Detail & Related papers (2023-03-28T09:06:28Z) - Distributed intelligence on the Edge-to-Cloud Continuum: A systematic
literature review [62.997667081978825]
This review aims at providing a comprehensive vision of the main state-of-the-art libraries and frameworks for machine learning and data analytics available today.
The main simulation, emulation, deployment systems, and testbeds for experimental research on the Edge-to-Cloud Continuum available today are also surveyed.
arXiv Detail & Related papers (2022-04-29T08:06:05Z) - Federated Stochastic Gradient Descent Begets Self-Induced Momentum [151.4322255230084]
Federated learning (FL) is an emerging machine learning method that can be applied in mobile edge systems.
We show that running stochastic gradient descent (SGD) in such a setting can be viewed as adding a momentum-like term to the global aggregation process.
arXiv Detail & Related papers (2022-02-17T02:01:37Z) - Differential privacy and robust statistics in high dimensions [49.50869296871643]
High-dimensional Propose-Test-Release (HPTR) builds upon three crucial components: the exponential mechanism, robust statistics, and the Propose-Test-Release mechanism.
We show that HPTR nearly achieves the optimal sample complexity under several scenarios studied in the literature.
arXiv Detail & Related papers (2021-11-12T06:36:40Z) - Sensitivity analysis in differentially private machine learning using
hybrid automatic differentiation [54.88777449903538]
We introduce a novel hybrid automatic differentiation (AD) system for sensitivity analysis.
This enables modelling the sensitivity of arbitrary differentiable function compositions, such as the training of neural networks on private data.
Our approach can enable principled reasoning about privacy loss in the setting of data processing.
arXiv Detail & Related papers (2021-07-09T07:19:23Z)
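As referenced in the entry on the extensible architecture-based data flow analysis framework above, the following hypothetical Java sketch illustrates the general idea of extracting a flat list of data flow edges from a simplified architectural model, so that a DFD-style check (such as the one sketched after the abstract) can run on them. The record types are invented simplifications and do not reflect that framework's or Palladio's actual metamodel.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: derive DFD-style flows from a simplified architectural model.
// These types are invented for illustration and do not reflect Palladio's metamodel.
public class ArchitectureToDataFlows {

    // A call from one component to another, carrying some data item.
    record OperationCall(String caller, String callee, String dataItem) {}

    // A simplified architectural model: just the recorded calls between components.
    record ArchitectureModel(List<OperationCall> calls) {}

    // A DFD-style data flow edge derived from the architecture.
    record DataFlowEdge(String from, String to, String dataItem) {}

    // Extraction: every operation call induces one data flow edge.
    static List<DataFlowEdge> extractDataFlows(ArchitectureModel model) {
        List<DataFlowEdge> edges = new ArrayList<>();
        for (OperationCall call : model.calls()) {
            edges.add(new DataFlowEdge(call.caller(), call.callee(), call.dataItem()));
        }
        return edges;
    }

    public static void main(String[] args) {
        ArchitectureModel model = new ArchitectureModel(List.of(
                new OperationCall("WebShop", "PaymentService", "credit card number"),
                new OperationCall("PaymentService", "AuditLog", "transaction id")));

        extractDataFlows(model).forEach(edge ->
                System.out.println(edge.from() + " -> " + edge.to() + " : " + edge.dataItem()));
    }
}
```

In practice, such an extraction would traverse calls, return values, and stored data in the architectural model rather than a flat call list.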
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.