Safety-Critical Adaptation in Self-Adaptive Systems
- URL: http://arxiv.org/abs/2210.00095v1
- Date: Fri, 30 Sep 2022 21:16:34 GMT
- Title: Safety-Critical Adaptation in Self-Adaptive Systems
- Authors: Simon Diemert, Jens H. Weber
- Abstract summary: This paper proposes a definition of a safety-critical self-adaptive system.
It describes a taxonomy for classifying adaptations into different types based on their impact on the system's safety and the system's safety case.
Each type in the taxonomy is illustrated using the example of a safety-critical self-adaptive water heating system.
- Score: 1.599072005190786
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern systems are designed to operate in increasingly variable and uncertain
environments. Not only are these environments complex, in the sense that they
contain a tremendous number of variables, but they also change over time.
Systems must be able to adjust their behaviour at run-time to manage these
uncertainties. These self-adaptive systems have been studied extensively. This
paper proposes a definition of a safety-critical self-adaptive system and then
describes a taxonomy for classifying adaptations into different types based on
their impact on the system's safety and the system's safety case. The taxonomy
expresses criteria for classification and then describes specific criteria that
the safety case for a self-adaptive system must satisfy, depending on the type
of adaptations performed. Each type in the taxonomy is illustrated using the
example of a safety-critical self-adaptive water heating system.
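To make the running example concrete, the following is a minimal sketch (not the authors' implementation) of a self-adaptive water heater built around a MAPE-K-style loop — Monitor, Analyze, Plan, Execute over a shared knowledge base — where a safety constraint always overrides the performance goal. The temperature limits and setpoint are assumed illustrative values, not taken from the paper.

```python
MAX_SAFE_TEMP = 60.0   # assumed scald-prevention limit (deg C), illustrative only
TARGET_TEMP = 50.0     # assumed user setpoint, illustrative only


def monitor(sensor_temp: float, knowledge: dict) -> None:
    """Monitor: record the latest sensed temperature in the knowledge base."""
    knowledge["temp"] = sensor_temp


def analyze(knowledge: dict) -> str:
    """Analyze: classify the current state relative to safety and the goal."""
    temp = knowledge["temp"]
    if temp >= MAX_SAFE_TEMP:
        return "unsafe"
    if temp < TARGET_TEMP:
        return "below_target"
    return "at_target"


def plan(state: str) -> str:
    """Plan: choose an adaptation; the safety case dictates that the
    'unsafe' state always maps to switching the heater off."""
    return {"unsafe": "heater_off",
            "below_target": "heater_on",
            "at_target": "heater_off"}[state]


def execute(action: str, knowledge: dict) -> None:
    """Execute: apply the chosen adaptation to the managed system."""
    knowledge["heater_on"] = (action == "heater_on")


def adaptation_step(sensor_temp: float, knowledge: dict) -> bool:
    """Run one MAPE iteration; return whether the heater is on."""
    monitor(sensor_temp, knowledge)
    execute(plan(analyze(knowledge)), knowledge)
    return knowledge["heater_on"]
```

In this sketch an adaptation that only tunes the setpoint within the safe envelope would leave the safety case untouched, whereas one that alters the analyze/plan rules could change the safety argument itself — the kind of distinction the paper's taxonomy classifies.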
Related papers
- Towards Formal Fault Injection for Safety Assessment of Automated Systems [0.0]
This paper introduces formal fault injection, a fusion of formal methods and fault injection applied throughout the development lifecycle.
We advocate for a more cohesive approach by identifying five areas of mutual support between formal methods and fault injection.
arXiv Detail & Related papers (2023-11-16T11:34:18Z)
- "One-Size-Fits-All"? Examining Expectations around What Constitute "Fair" or "Good" NLG System Behaviors [57.63649797577999]
We conduct case studies in which we perturb different types of identity-related language features (names, roles, locations, dialect, and style) in NLG system inputs.
We find that motivations for adaptation include social norms, cultural differences, feature-specific information, and accommodation.
In contrast, motivations for invariance include perspectives that favor prescriptivism, view adaptation as unnecessary or too difficult for NLG systems to do appropriately, and are wary of false assumptions.
arXiv Detail & Related papers (2023-10-23T23:00:34Z)
- DARTH: Holistic Test-time Adaptation for Multiple Object Tracking [87.72019733473562]
Multiple object tracking (MOT) is a fundamental component of perception systems for autonomous driving.
Despite the importance of safety in driving systems, no prior solution addresses adapting MOT to domain shift under test-time conditions.
We introduce DARTH, a holistic test-time adaptation framework for MOT.
arXiv Detail & Related papers (2023-10-03T10:10:42Z)
- Towards Model Co-evolution Across Self-Adaptation Steps for Combined Safety and Security Analysis [44.339753503750735]
We present several models that describe different aspects of a self-adaptive system.
We outline our idea of how these models can then be combined into an Attack-Fault Tree.
arXiv Detail & Related papers (2023-09-18T10:35:40Z)
- Awareness requirement and performance management for adaptive systems: a survey [13.406015141662879]
Self-adaptive software can modify its behavior when the assessment indicates that the program is not performing as intended or when improved functionality or performance is available.
This paper presents a review of self-adaptive systems in the context of requirement awareness and summarizes the most common methodologies applied.
arXiv Detail & Related papers (2023-01-22T14:27:11Z)
- Dealing with Drift of Adaptation Spaces in Learning-based Self-Adaptive Systems using Lifelong Self-Adaptation [10.852698169509006]
We focus on a particularly important challenge for learning-based self-adaptive systems: drift in adaptation spaces.
Drift of adaptation spaces originates from uncertainties, affecting the quality properties of the adaptation options.
We present a novel approach to self-adaptation that enhances learning-based self-adaptive systems with a lifelong ML layer.
arXiv Detail & Related papers (2022-11-04T07:45:48Z)
- Recursively Feasible Probabilistic Safe Online Learning with Control Barrier Functions [63.18590014127461]
This paper introduces a model-uncertainty-aware reformulation of CBF-based safety-critical controllers.
We study the feasibility of the resulting robust safety-critical controller.
We then use these conditions to devise an event-triggered online data collection strategy.
arXiv Detail & Related papers (2022-08-23T05:02:09Z)
- Auto-COP: Adaptation Generation in Context-Oriented Programming using Reinforcement Learning Options [2.984934409689467]
We propose Auto-COP, a new technique to enable generation of adaptations at run time.
We present two case studies exhibiting different system characteristics and application domains.
We confirm that the generated adaptations exhibit correct system behavior measured by domain-specific performance metrics.
arXiv Detail & Related papers (2021-03-11T16:14:56Z)
- Learning Hybrid Control Barrier Functions from Data [66.37785052099423]
Motivated by the lack of systematic tools to obtain safe control laws for hybrid systems, we propose an optimization-based framework for learning certifiably safe control laws from data.
In particular, we assume a setting in which the system dynamics are known and in which data exhibiting safe system behavior is available.
arXiv Detail & Related papers (2020-11-08T23:55:02Z)
- Towards robust sensing for Autonomous Vehicles: An adversarial perspective [82.83630604517249]
It is of primary importance that the resulting decisions are robust to perturbations.
Adversarial perturbations are purposefully crafted alterations of the environment or of the sensory measurements.
A careful evaluation of the vulnerabilities of their sensing system(s) is necessary in order to build and deploy safer systems.
arXiv Detail & Related papers (2020-07-14T05:25:15Z)
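Two of the related papers above build on control barrier functions (CBFs). For context, the standard CBF safety condition — general background, not a result specific to either paper — can be stated as follows:

```latex
% Safe set defined by a continuously differentiable function h:
%   C = \{ x : h(x) \ge 0 \}
% For control-affine dynamics \dot{x} = f(x) + g(x)u, the function h is a
% control barrier function if there exists an extended class-\mathcal{K}_\infty
% function \alpha such that
\sup_{u \in U} \big[\, L_f h(x) + L_g h(x)\,u \,\big] \;\ge\; -\alpha\big(h(x)\big)
% Any Lipschitz controller u(x) satisfying this inequality pointwise renders
% C forward invariant, i.e., the system never leaves the safe set.
```

The "Recursively Feasible Probabilistic Safe Online Learning" entry robustifies this inequality against model uncertainty, while "Learning Hybrid Control Barrier Functions from Data" learns a suitable h from demonstrations of safe behavior.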
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.