Identification of Probability weighted ARX models with arbitrary domains
- URL: http://arxiv.org/abs/2009.13975v1
- Date: Tue, 29 Sep 2020 12:50:33 GMT
- Title: Identification of Probability weighted ARX models with arbitrary domains
- Authors: Alessandro Brusaferri and Matteo Matteucci and Stefano Spinelli
- Abstract summary: PieceWise Affine models guarantee universal approximation, local linearity and equivalence to other classes of hybrid systems.
In this work, we focus on the identification of PieceWise Auto Regressive with eXogenous input models with arbitrary regions (NPWARX).
The architecture is conceived following the Mixture of Experts concept, developed within the machine learning field.
- Score: 75.91002178647165
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hybrid system identification is a key tool to achieve reliable models of
Cyber-Physical Systems from data. PieceWise Affine models guarantee universal
approximation, local linearity and equivalence to other classes of hybrid
systems. Still, PWA identification is a challenging problem, requiring the
concurrent solution of regression and classification tasks. In this work, we
focus on the identification of PieceWise Auto Regressive with eXogenous input
models with arbitrary regions (NPWARX), thus not restricted to polyhedral
domains, and characterized by discontinuous maps. To this end, we propose a
method based on a probabilistic mixture model, where the discrete state is
represented through a multinomial distribution conditioned by the input
regressors. The architecture is conceived following the Mixture of Experts
concept, developed within the machine learning field. To achieve nonlinear
partitioning, we parametrize the discriminant function using a neural network.
Then, the parameters of both the ARX submodels and the classifier are
concurrently estimated by maximizing the likelihood of the overall model using
Expectation Maximization. The proposed method is demonstrated on a nonlinear
piece-wise problem with discontinuous maps.
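As a rough illustration of the estimation scheme described above, here is a minimal sketch (not the authors' implementation) of Expectation Maximization for a mixture of ARX experts with a multinomial gate conditioned on the regressor. The paper parametrizes the discriminant function with a neural network; the sketch substitutes a linear softmax gate to stay short, and names such as `em_npwarx`, `n_experts` and `noise_var` are illustrative assumptions.

```python
# Minimal sketch, assuming a linear softmax gate instead of the paper's neural network gate.
import numpy as np

def em_npwarx(X, y, n_experts=2, n_iter=50, noise_var=0.1, lr=0.5, seed=0):
    """X: (N, d) regressor matrix (lagged outputs/inputs), y: (N,) targets."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    Xb = np.hstack([X, np.ones((N, 1))])                     # add bias column
    theta = rng.normal(scale=0.1, size=(n_experts, d + 1))   # ARX submodel parameters
    W = rng.normal(scale=0.1, size=(n_experts, d + 1))       # gate parameters

    for _ in range(n_iter):
        # Gate: multinomial distribution over experts, conditioned on the regressor
        logits = Xb @ W.T
        logits -= logits.max(axis=1, keepdims=True)
        gate = np.exp(logits)
        gate /= gate.sum(axis=1, keepdims=True)              # (N, K)

        # E-step: responsibilities = gate prior x Gaussian ARX likelihood, normalized per sample
        pred = Xb @ theta.T                                   # (N, K)
        loglik = -0.5 * (y[:, None] - pred) ** 2 / noise_var
        r = gate * np.exp(loglik - loglik.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)

        # M-step (experts): responsibility-weighted least squares per ARX submodel
        for k in range(n_experts):
            w_k = r[:, k]
            A = Xb.T @ (w_k[:, None] * Xb) + 1e-6 * np.eye(d + 1)
            theta[k] = np.linalg.solve(A, Xb.T @ (w_k * y))

        # M-step (gate): one gradient ascent step on the expected complete-data log-likelihood
        W += lr * (r - gate).T @ Xb / N

    return theta, W
```

The E-step assigns each sample softly to the discrete modes; the M-step refits each ARX submodel by weighted least squares and updates the gate on the same soft targets. Replacing the linear gate with a small neural network trained on those targets is what recovers the nonlinear, non-polyhedral partitioning the paper targets.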
Related papers
- DiffHybrid-UQ: Uncertainty Quantification for Differentiable Hybrid Neural Modeling [4.76185521514135]
We introduce a novel method, DiffHybrid-UQ, for effective and efficient uncertainty propagation and estimation in hybrid neural differentiable models.
Specifically, our approach effectively discerns and quantifies both aleatoric uncertainties, arising from data noise, and epistemic uncertainties, resulting from model-form discrepancies and data sparsity.
arXiv Detail & Related papers (2023-12-30T07:40:47Z)
- Generative Marginalization Models [21.971818180264943]
Marginalization models (MAMs) are a new family of generative models for high-dimensional discrete data.
They offer scalable and flexible generative modeling by explicitly modeling all induced marginal distributions.
For energy-based training tasks, MAMs enable any-order generative modeling of high-dimensional problems beyond the scale of previous methods.
arXiv Detail & Related papers (2023-10-19T17:14:29Z)
- Deep Learning Aided Laplace Based Bayesian Inference for Epidemiological Systems [2.596903831934905]
We propose a hybrid approach where Laplace-based Bayesian inference is combined with an ANN architecture for obtaining approximations to the ODE trajectories.
The effectiveness of our proposed methods is demonstrated using an epidemiological system with non-analytical solutions, the Susceptible-Infectious-Removed (SIR) model for infectious diseases.
arXiv Detail & Related papers (2022-10-17T09:02:41Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Deep Variational Models for Collaborative Filtering-based Recommender Systems [63.995130144110156]
Deep learning provides accurate collaborative filtering models to improve recommender system results.
Our proposed models apply the variational concept to inject stochasticity in the latent space of the deep architecture.
Results show the superiority of the proposed approach in scenarios where the variational enrichment exceeds the injected noise effect.
arXiv Detail & Related papers (2021-07-27T08:59:39Z)
- Gaussian Process Latent Class Choice Models [7.992550355579791]
We present a non-parametric class of probabilistic machine learning within discrete choice models (DCMs).
The proposed model would assign individuals probabilistically to behaviorally homogeneous clusters (latent classes) using GPs.
The model is tested on two different mode choice applications and compared against different LCCM benchmarks.
arXiv Detail & Related papers (2021-01-28T19:56:42Z)
- Shaping Deep Feature Space towards Gaussian Mixture for Visual Classification [74.48695037007306]
We propose a Gaussian mixture (GM) loss function for deep neural networks for visual classification.
With a classification margin and a likelihood regularization, the GM loss facilitates both high classification performance and accurate modeling of the feature distribution.
The proposed model can be implemented easily and efficiently without using extra trainable parameters.
arXiv Detail & Related papers (2020-11-18T03:32:27Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Estimation of Switched Markov Polynomial NARX models [75.91002178647165]
We identify a class of models for hybrid dynamical systems characterized by nonlinear autoregressive (NARX) components.
The proposed approach is demonstrated on an SMNARX problem composed of three nonlinear sub-models with specific regressors.
arXiv Detail & Related papers (2020-09-29T15:00:47Z)