A new perspective on Bayesian Operational Modal Analysis
- URL: http://arxiv.org/abs/2408.08664v2
- Date: Mon, 19 Aug 2024 12:20:26 GMT
- Title: A new perspective on Bayesian Operational Modal Analysis
- Authors: Brandon J. O'Connell, Max D. Champneys, Timothy J. Rogers
- Abstract summary: In this article, a new perspective on Bayesian OMA is proposed: a Bayesian stochastic subspace identification (SSI) algorithm.
Two case studies are explored: the first is a benchmark study using data from a simulated, multi-degree-of-freedom, linear system.
It is observed that posterior distributions whose mean values coincide with the natural frequencies exhibit lower variance than those whose means lie away from the natural frequencies.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the field of operational modal analysis (OMA), obtained modal information is frequently used to assess the current state of aerospace, mechanical, offshore and civil structures. However, the stochasticity of operational systems and the lack of forcing information can lead to inconsistent results. Quantifying the uncertainty of the recovered modal parameters through OMA is therefore of significant value. In this article, a new perspective on Bayesian OMA is proposed: a Bayesian stochastic subspace identification (SSI) algorithm. Distinct from existing approaches to Bayesian OMA, a hierarchical probabilistic model is embedded at the core of covariance-driven SSI. Through substitution of canonical correlation analysis with a Bayesian equivalent, posterior distributions over the modal properties are obtained. Two inference schemes are presented for the proposed Bayesian formulation: Markov Chain Monte Carlo and variational Bayes. Two case studies are then explored. The first is a benchmark study using data from a simulated, multi-degree-of-freedom, linear system. Following application of Bayesian SSI, it is shown that the same posterior is targeted and recovered by both inference schemes, with good agreement between the posterior mean and the conventional SSI result. The second study applies the variational form to data obtained from an in-service structure: the Z24 Bridge. The results of this study are presented at single model orders, and then using a stabilisation diagram. The recovered posterior uncertainty is presented and compared to the classic SSI result. It is observed that posterior distributions whose mean values coincide with the natural frequencies exhibit lower variance than those whose means lie away from the natural frequencies.
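For orientation, the sketch below shows the classical covariance-driven SSI pipeline that the paper builds on: output correlations are stacked into a block Toeplitz matrix, a singular value decomposition recovers the observability matrix, and the eigenvalues of the resulting state matrix give natural frequencies and damping ratios. It is a minimal NumPy illustration, not the authors' Bayesian algorithm: the SVD step stands in for the canonical correlation analysis that the paper replaces with a Bayesian equivalent, and the function name, default parameters and the synthetic single-channel example are assumptions made for illustration.

```python
import numpy as np

def ssi_cov_modal(y, dt, n_blocks=30, model_order=2):
    """Estimate natural frequencies (Hz) and damping ratios from output-only data.

    y  : (n_channels, n_samples) array of measured responses
    dt : sampling interval in seconds
    """
    n_ch, n_s = y.shape
    y = y - y.mean(axis=1, keepdims=True)

    # Output correlation matrices R_k = E[y_{t+k} y_t^T] for lags 1 .. 2*n_blocks - 1
    R = [y[:, k:] @ y[:, :n_s - k].T / (n_s - k) for k in range(1, 2 * n_blocks)]

    # Block Toeplitz matrix of output correlations
    T = np.block([[R[n_blocks - 1 + p - q] for q in range(n_blocks)]
                  for p in range(n_blocks)])

    # SVD of the Toeplitz matrix -- the canonical-correlation-style step that the
    # Bayesian formulation replaces with a probabilistic equivalent
    U, s, _ = np.linalg.svd(T)
    O = U[:, :model_order] * np.sqrt(s[:model_order])  # observability matrix

    # State matrix A from the shift invariance of the observability matrix
    A = np.linalg.pinv(O[:-n_ch, :]) @ O[n_ch:, :]

    # Discrete-time eigenvalues -> continuous-time poles -> modal parameters
    mu = np.linalg.eigvals(A).astype(complex)
    lam = np.log(mu) / dt
    freqs = np.abs(lam) / (2 * np.pi)
    damping = -lam.real / np.abs(lam)
    order = np.argsort(freqs)
    return freqs[order], damping[order]

# Illustrative use on a white-noise-excited oscillator (assumed ambient data):
rng = np.random.default_rng(0)
dt, f_true, zeta = 0.01, 2.0, 0.02
wn, wd = 2 * np.pi * f_true, 2 * np.pi * f_true * np.sqrt(1 - zeta**2)
a1, a2 = 2 * np.exp(-zeta * wn * dt) * np.cos(wd * dt), -np.exp(-2 * zeta * wn * dt)
y = np.zeros(20000)
e = rng.standard_normal(y.size)
for t in range(2, y.size):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + e[t]

freqs, zetas = ssi_cov_modal(y[None, :], dt)
print(freqs, zetas)  # both conjugate poles should sit near 2 Hz, ~2% damping
```

In the Bayesian SSI described in the abstract, that decomposition step would instead yield posterior distributions over the modal properties (via MCMC or variational Bayes) rather than point estimates, so each recovered frequency and damping ratio carries a quantified uncertainty.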
Related papers
- Dynamical System Identification, Model Selection and Model Uncertainty Quantification by Bayesian Inference [0.8388591755871735]
This study presents a Bayesian maximum a posteriori (MAP) framework for dynamical system identification from time-series data.
arXiv Detail & Related papers (2024-01-30T12:16:52Z)
- Learning to solve Bayesian inverse problems: An amortized variational inference approach using Gaussian and Flow guides [0.0]
We develop a methodology that enables real-time inference by learning the Bayesian inverse map, i.e., the map from data to posteriors.
Our approach provides the posterior distribution for a given observation just at the cost of a forward pass of the neural network.
arXiv Detail & Related papers (2023-05-31T16:25:07Z)
- Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Adversarial Bayesian Simulation [0.9137554315375922]
We bridge approximate Bayesian computation (ABC) with deep neural implicit samplers based on adversarial networks (GANs) and adversarial variational Bayes.
We develop a Bayesian GAN that directly targets the posterior by solving an adversarial optimization problem.
We show that the typical total variation distance between the true and approximate posteriors converges to zero for certain neural network generators and discriminators.
arXiv Detail & Related papers (2022-08-25T14:18:39Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- DiBS: Differentiable Bayesian Structure Learning [38.01659425023988]
We propose a general, fully differentiable framework for Bayesian structure learning (DiBS).
DiBS operates in the continuous space of a latent probabilistic graph representation.
Contrary to existing work, DiBS is agnostic to the form of the local conditional distributions.
arXiv Detail & Related papers (2021-05-25T11:23:08Z)
- A comprehensive comparative evaluation and analysis of Distributional Semantic Models [61.41800660636555]
We perform a comprehensive evaluation of type distributional vectors, either produced by static DSMs or obtained by averaging the contextualized vectors generated by BERT.
The results show that the alleged superiority of predict-based models is more apparent than real, and surely not ubiquitous.
We borrow from cognitive neuroscience the methodology of Representational Similarity Analysis (RSA) to inspect the semantic spaces generated by distributional models.
arXiv Detail & Related papers (2021-05-20T15:18:06Z)
- Decision-Making with Auto-Encoding Variational Bayes [71.44735417472043]
We show that a posterior approximation distinct from the variational distribution should be used for making decisions.
Motivated by these theoretical results, we propose learning several approximate proposals for the best model.
In addition to toy examples, we present a full-fledged case study of single-cell RNA sequencing.
arXiv Detail & Related papers (2020-02-17T19:23:36Z)
- Nonparametric Bayesian volatility learning under microstructure noise [2.812395851874055]
We study the problem of learning the volatility under market microstructure noise.
Specifically, we consider noisy discrete-time observations from a stochastic differential equation.
We develop a novel computational method to learn the diffusion coefficient of the equation.
arXiv Detail & Related papers (2018-05-15T07:32:18Z)