SOTER on ROS: A Run-Time Assurance Framework on the Robot Operating
System
- URL: http://arxiv.org/abs/2008.09707v1
- Date: Fri, 21 Aug 2020 22:48:26 GMT
- Title: SOTER on ROS: A Run-Time Assurance Framework on the Robot Operating
System
- Authors: Sumukh Shivakumar, Hazem Torfah, Ankush Desai, Sanjit A. Seshia
- Abstract summary: SOTER is a run-time assurance framework for building safe distributed mobile robotic systems.
We show that SOTER-enabled systems ensure safety, even when using unknown and untrusted components.
- Score: 5.358161704743753
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present an implementation of SOTER, a run-time assurance framework for
building safe distributed mobile robotic (DMR) systems, on top of the Robot
Operating System (ROS). The safety of DMR systems cannot always be guaranteed
at design time, especially when complex, off-the-shelf components are used that
cannot be verified easily. SOTER addresses this by providing a language-based
approach for run-time assurance for DMR systems. SOTER implements the reactive
robotic software using the language P, a domain-specific language designed for
implementing asynchronous event-driven systems, along with an integrated
run-time assurance system that allows programmers to use unfortified components
but still provide safety guarantees. We describe an implementation of SOTER for
ROS and demonstrate its efficacy using a multi-robot surveillance case study,
with multiple run-time assurance modules. Through rigorous simulation, we show
that SOTER-enabled systems ensure safety, even when using unknown and untrusted
components.
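
As a reading aid, the sketch below illustrates the Simplex-style run-time assurance pattern the abstract refers to: a decision module monitors the system state and switches from an unverified, high-performance controller to a certified safe controller before a safety predicate is violated. This is a minimal, hypothetical Python rendering for illustration only; SOTER itself is implemented in the P language and integrated with ROS, and the names used here (RTAModule, phi_safe, phi_safer) are assumptions, not SOTER's actual API.

```python
# Illustrative sketch only: SOTER is written in the P language and its real API
# differs. This hypothetical class shows the Simplex-style run-time assurance
# pattern: an untrusted "advanced" controller is monitored by a decision module
# that falls back to a verified safe controller near the safety boundary.

from dataclasses import dataclass
from typing import Callable

State = dict        # e.g. {"obstacle_dist": 0.8}
Command = dict      # e.g. {"vx": 0.5}

@dataclass
class RTAModule:
    advanced: Callable[[State], Command]   # unverified, high-performance controller
    safe: Callable[[State], Command]       # certified fallback controller
    phi_safer: Callable[[State], bool]     # stricter predicate: OK to hand control back
    phi_safe: Callable[[State], bool]      # safety predicate: must hold at all times
    in_safe_mode: bool = False

    def step(self, state: State) -> Command:
        """Called once per control period by the decision module."""
        if self.in_safe_mode:
            # Only return control to the advanced controller once the system
            # is comfortably inside the safe region.
            if self.phi_safer(state):
                self.in_safe_mode = False
        elif not self.phi_safe(state):
            # The advanced controller has driven the system toward the safety
            # boundary; switch to the certified fallback.
            self.in_safe_mode = True
        controller = self.safe if self.in_safe_mode else self.advanced
        return controller(state)

# Hypothetical usage for a single robot avoiding collisions:
rta = RTAModule(
    advanced=lambda s: {"vx": 1.0},               # e.g. an untrusted ML planner
    safe=lambda s: {"vx": 0.0},                   # stop in place
    phi_safer=lambda s: s["obstacle_dist"] > 2.0,
    phi_safe=lambda s: s["obstacle_dist"] > 0.5,
)
print(rta.step({"obstacle_dist": 0.4}))  # falls back to the safe controller
```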
Related papers
- Asynchronous Tool Usage for Real-Time Agents [61.3041983544042]
We introduce asynchronous AI agents capable of parallel processing and real-time tool-use.
Our key contribution is an event-driven finite-state machine architecture for agent execution and prompting.
This work presents both a conceptual framework and practical tools for creating AI agents capable of fluid, multitasking interactions.
arXiv Detail & Related papers (2024-10-28T23:57:19Z)
- ROS-LLM: A ROS framework for embodied AI with task feedback and structured reasoning [74.58666091522198]
We present a framework for intuitive robot programming by non-experts.
We leverage natural language prompts and contextual information from the Robot Operating System (ROS).
Our system integrates large language models (LLMs), enabling non-experts to articulate task requirements to the system through a chat interface.
arXiv Detail & Related papers (2024-06-28T08:28:38Z)
- Learning Run-time Safety Monitors for Machine Learning Components [8.022333445774382]
This paper introduces a process for creating safety monitors for machine learning components through the use of degraded datasets and machine learning.
The resulting safety monitor is deployed to the autonomous system (AS) in parallel with the ML component and provides a prediction of the safety risk associated with the model's output.
arXiv Detail & Related papers (2024-06-23T21:25:06Z)
- MMRNet: Improving Reliability for Multimodal Object Detection and Segmentation for Bin Picking via Multimodal Redundancy [68.7563053122698]
We propose a reliable object detection and segmentation system with MultiModal Redundancy (MMRNet).
This is the first system that introduces the concept of multimodal redundancy to address sensor failure issues during deployment.
We present a new label-free multi-modal consistency (MC) score that utilizes the output from all modalities to measure the overall system output reliability and uncertainty.
arXiv Detail & Related papers (2022-10-19T19:15:07Z)
- Monitoring ROS2: from Requirements to Autonomous Robots [58.720142291102135]
This paper provides an overview of a formal approach to generating runtime monitors for autonomous robots from requirements written in a structured natural language.
Our approach integrates the Formal Requirement Elicitation Tool (FRET) with Copilot, a runtime verification framework, through the Ogma integration tool.
arXiv Detail & Related papers (2022-09-28T12:19:13Z)
- Recursively Feasible Probabilistic Safe Online Learning with Control Barrier Functions [60.26921219698514]
We introduce a model-uncertainty-aware reformulation of CBF-based safety-critical controllers.
We then present the pointwise feasibility conditions of the resulting safety controller.
We use these conditions to devise an event-triggered online data collection strategy.
arXiv Detail & Related papers (2022-08-23T05:02:09Z)
- A Compositional Approach to Verifying Modular Robotic Systems [1.385411134620987]
This paper describes a compositional approach to specifying the nodes in robotic systems built using the Robot Operating System (ROS).
We introduce inference rules that facilitate the composition of these node-level contracts to derive system-level properties.
We also present a novel Domain-Specific Language, the ROS Contract Language, which captures a node's first-order logic (FOL) specification and links this contract to its implementation.
arXiv Detail & Related papers (2022-08-10T18:01:40Z)
- An Empirical Analysis of the Use of Real-Time Reachability for the Safety Assurance of Autonomous Vehicles [7.1169864450668845]
We propose using a real-time reachability algorithm to implement the simplex architecture and assure the safety of a 1/10-scale open-source autonomous vehicle platform.
In our approach, the need to analyze an underlying controller is abstracted away; instead, we focus on the effects of the controller's decisions on the system's future states.
arXiv Detail & Related papers (2022-05-03T11:12:29Z)
- Multi Agent System for Machine Learning Under Uncertainty in Cyber Physical Manufacturing System [78.60415450507706]
Recent advancements in predictive machine learning have led to its application in various manufacturing use cases.
Most research has focused on maximising predictive accuracy without addressing the uncertainty associated with it.
In this paper, we determine the sources of uncertainty in machine learning and establish the success criteria for a machine learning system to function well under uncertainty.
arXiv Detail & Related papers (2021-07-28T10:28:05Z)
- Guidance on the Assurance of Machine Learning in Autonomous Systems (AMLAS) [16.579772998870233]
We introduce a methodology for the Assurance of Machine Learning for use in Autonomous Systems (AMLAS).
AMLAS comprises a set of safety case patterns and a process for integrating safety assurance into the development of ML components.
arXiv Detail & Related papers (2021-02-02T15:41:57Z)
- Runtime Safety Assurance Using Reinforcement Learning [37.61747231296097]
This paper aims to design a meta-controller capable of identifying unsafe situations with high accuracy.
We frame the design of runtime safety assurance (RTSA) as a Markov decision process (MDP) and use reinforcement learning (RL) to solve it.
arXiv Detail & Related papers (2020-10-20T20:54:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed above and is not responsible for any consequences of its use.