Learning Physical Concepts in Cyber-Physical Systems: A Case Study
- URL: http://arxiv.org/abs/2111.14151v1
- Date: Sun, 28 Nov 2021 14:24:52 GMT
- Title: Learning Physical Concepts in Cyber-Physical Systems: A Case Study
- Authors: Henrik S. Steude and Alexander Windmann and Oliver Niggemann
- Abstract summary: We provide an overview of the current state of research regarding methods for learning physical concepts in time series data.
We also analyze the most important methods from the current state of the art using the example of a three-tank system.
- Score: 72.74318982275052
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine Learning (ML) has achieved great successes in recent decades, both in
research and in practice. In Cyber-Physical Systems (CPS), ML can for example
be used to optimize systems, to detect anomalies or to identify root causes of
system failures. However, existing algorithms suffer from two major drawbacks:
(i) They are hard to interpret by human experts. (ii) Transferring results from
one system to another (similar) system is often a challenge. Concept learning,
or Representation Learning (RepL), is a solution to both of these drawbacks;
mimicking the human solution approach to explainability and transferability:
by learning general concepts such as physical quantities or system states, the
model becomes interpretable by humans. Furthermore, concepts on this abstract
level can normally be applied to a wide range of different systems. Modern ML
methods are already widely used in CPS, but concept learning and transfer
learning are hardly used so far. In this paper, we provide an overview of the
current state of research regarding methods for learning physical concepts in
time series data, which is the primary form of sensor data of CPS. We also
analyze the most important methods from the current state of the art using the
example of a three-tank system. Based on these concrete implementations, we
discuss the advantages and disadvantages of the methods and show for which
purpose and under which conditions they can be used.
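The abstract above describes learning physical concepts, such as system states, from the time-series sensor data of a three-tank system. As a minimal illustrative sketch only (not the paper's method: the tank dynamics, constants, and the use of linear PCA as the simplest form of representation learning are all assumptions made here), one can simulate noisy level measurements and check that a low-dimensional latent representation captures most of the signal:

```python
import numpy as np

# Hypothetical three-tank dynamics: inflow into tank 1, Torricelli-like
# couplings between tanks, outflow from tank 3 (constants are assumptions).
def simulate(steps=500, dt=0.1, q_in=0.5, k=0.3, seed=0):
    rng = np.random.default_rng(seed)
    h = np.array([1.0, 0.5, 0.2])          # initial fill levels
    traj = []
    for _ in range(steps):
        f12 = k * np.sign(h[0] - h[1]) * np.sqrt(abs(h[0] - h[1]))
        f23 = k * np.sign(h[1] - h[2]) * np.sqrt(abs(h[1] - h[2]))
        f3o = k * np.sqrt(max(h[2], 0.0))
        h = h + dt * np.array([q_in - f12, f12 - f23, f23 - f3o])
        traj.append(h + rng.normal(0.0, 0.01, 3))  # noisy sensor readings
    return np.array(traj)

X = simulate()                              # shape (500, 3): levels over time
# Simplest linear "concept": PCA via SVD of the centered measurements.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / (S**2).sum()             # variance share per component
```

Because all three levels rise together toward a steady state, the first principal component explains most of the variance; a learned one-dimensional latent variable here roughly corresponds to the overall fill state, which is the kind of interpretable, transferable concept the abstract refers to.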
Related papers
- Interpretable Meta-Learning of Physical Systems [4.343110120255532]
Recent meta-learning methods rely on black-box neural networks, resulting in high computational costs and limited interpretability.
We argue that multi-environment generalization can be achieved using a simpler learning model, with an affine structure with respect to the learning task.
We demonstrate the competitive generalization performance and the low computational cost of our method by comparing it to state-of-the-art algorithms on physical systems.
arXiv Detail & Related papers (2023-12-01T10:18:50Z) - STPA for Learning-Enabled Systems: A Survey and A New Practice [12.665507596261266]
Systems Theoretic Process Analysis (STPA) is a systematic approach for hazard analysis that has been used across many industrial sectors including transportation, energy, and defense.
The trend of using Machine Learning (ML) in safety-critical systems has led to the need to extend STPA to Learning-Enabled Systems (LESs).
We present a systematic survey of 31 papers, summarising them from five perspectives (attributes of concern, objects under study, modifications, derivatives and processes being modelled)
We introduce DeepSTPA, which enhances STPA from two aspects that are missing from the state-of-the-art.
arXiv Detail & Related papers (2023-02-21T10:43:51Z) - Deep learning applied to computational mechanics: A comprehensive
review, state of the art, and the classics [77.34726150561087]
Recent developments in artificial neural networks, particularly deep learning (DL), are reviewed in detail.
Both hybrid and pure machine learning (ML) methods are discussed.
History and limitations of AI are recounted and discussed, with particular attention to pointing out misstatements or misconceptions of the classics.
arXiv Detail & Related papers (2022-12-18T02:03:00Z) - Characterizing possible failure modes in physics-informed neural
networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
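The PINN summary above concerns a composite objective: a data/boundary term plus a physics-residual term, whose joint loss landscape can be hard to optimize. A deliberately simplified sketch (assumptions made here: a two-parameter model in place of a neural network, central finite differences in place of automatic differentiation, and the toy ODE u'' = -u with u(0) = u(π) = 0) shows the structure of such a loss:

```python
import numpy as np

x = np.linspace(0.0, np.pi, 50)

def model(theta, x):
    # Tiny parametric stand-in for a neural network (assumption).
    a, b = theta
    return a * np.sin(x) + b * np.cos(x)

def loss(theta):
    u = model(theta, x)
    h = x[1] - x[0]
    # Physics residual u'' + u ~ 0, via central differences on interior points.
    d2u = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
    phys = np.mean((d2u + u[1:-1]) ** 2)
    # Boundary-condition ("data") term: u(0) = u(pi) = 0.
    data = u[0] ** 2 + u[-1] ** 2
    return data + phys

# Crude gradient descent with numerical gradients.
theta = np.array([0.2, 0.5])
for _ in range(200):
    g, eps = np.zeros(2), 1e-5
    for i in range(2):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (loss(tp) - loss(tm)) / (2.0 * eps)
    theta -= 0.1 * g
```

Even this toy problem hints at the failure mode discussed above: the boundary term drives b to zero, but the physics residual alone leaves a essentially unconstrained (any multiple of sin(x) satisfies the ODE), so the optimizer has no gradient signal toward a particular amplitude.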
arXiv Detail & Related papers (2021-09-02T16:06:45Z) - An Extensible Benchmark Suite for Learning to Simulate Physical Systems [60.249111272844374]
We introduce a set of benchmark problems to take a step towards unified benchmarks and evaluation protocols.
We propose four representative physical systems, as well as a collection of both widely used classical time-based and representative data-driven methods.
arXiv Detail & Related papers (2021-08-09T17:39:09Z) - Empirically Measuring Transfer Distance for System Design and Operation [2.9864637081333085]
We show that transfer learning algorithms have few, if any, examples from which to learn.
We consider the use of transfer distance in the design of machine rebuild procedures to allow for transferable prognostic models.
Practitioners can use the presented methodology to design and operate systems with consideration for the learning theoretic challenges faced by component learning systems.
arXiv Detail & Related papers (2021-07-02T16:45:58Z) - Model-Based Deep Learning [155.063817656602]
Signal processing, communications, and control have traditionally relied on classical statistical modeling techniques.
Deep neural networks (DNNs) use generic architectures which learn to operate from data, and demonstrate excellent performance.
We are interested in hybrid techniques that combine principled mathematical models with data-driven systems to benefit from the advantages of both approaches.
arXiv Detail & Related papers (2020-12-15T16:29:49Z) - A Survey on Large-scale Machine Learning [67.6997613600942]
Machine learning can provide deep insights into data, allowing machines to make high-quality predictions.
Most sophisticated machine learning approaches suffer from huge time costs when operating on large-scale data.
Large-scale Machine Learning aims to learn patterns from big data with comparable performance efficiently.
arXiv Detail & Related papers (2020-08-10T06:07:52Z) - Watch and learn -- a generalized approach for transferrable learning in
deep neural networks via physical principles [0.0]
We demonstrate an unsupervised learning approach that achieves fully transferrable learning for problems in statistical physics across different physical regimes.
By coupling a sequence model based on a recurrent neural network to an extensive deep neural network, we are able to learn the equilibrium probability distributions and inter-particle interaction models of classical statistical mechanical systems.
arXiv Detail & Related papers (2020-03-03T18:37:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all information) and is not responsible for any consequences.