A Tree Adjoining Grammar Representation for Models Of Stochastic
Dynamical Systems
- URL: http://arxiv.org/abs/2001.05320v2
- Date: Mon, 25 May 2020 13:24:49 GMT
- Title: A Tree Adjoining Grammar Representation for Models Of Stochastic
Dynamical Systems
- Authors: Dhruv Khandelwal, Maarten Schoukens and Roland Tóth
- Abstract summary: We propose a Tree Adjoining Grammar (TAG) for estimating model structure and complexity.
TAGs can be used to generate models in an Evolutionary Algorithm (EA) framework while imposing desirable structural constraints.
We demonstrate that TAGs can be easily extended to more general model classes, such as the non-linear Box-Jenkins model class.
- Score: 19.0709328061569
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Model structure and complexity selection remains a challenging problem in
system identification, especially for parametric non-linear models. Many
Evolutionary Algorithm (EA) based methods have been proposed in the literature
for estimating model structure and complexity. In most cases, the proposed
methods are devised for estimating structure and complexity within a specified
model class and hence these methods do not extend to other model structures
without significant changes. In this paper, we propose a Tree Adjoining Grammar
(TAG) for stochastic parametric models. TAGs can be used to generate models in
an EA framework while imposing desirable structural constraints and
incorporating prior knowledge. In this paper, we propose a TAG that can
systematically generate models ranging from FIRs to polynomial NARMAX models.
Furthermore, we demonstrate that TAGs can be easily extended to more general
model classes, such as the non-linear Box-Jenkins model class, enabling the
realization of flexible and automatic model structure and complexity selection
via EA.
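The grammar-constrained generation idea in the abstract can be illustrated with a minimal, hypothetical sketch. The code below is not the paper's TAG (a tree adjoining grammar uses substitution and adjunction on elementary trees, not a flat sampler); it only shows the underlying idea: derivations of a small toy grammar yield polynomial-NARX-style regressor sets, and an EA-style mutation operates on those structures. All names, lags, and parameters are illustrative.

```python
import random

# Illustrative sketch only: a toy stand-in for the paper's TAG.
# Derivations yield polynomial NARX-style model structures, i.e. sets of
# regressors built from delayed inputs u(k-i) and outputs y(k-j).

random.seed(0)

def random_term(max_lag=3, max_degree=2):
    """Sample one polynomial regressor, e.g. 'u(k-2)*y(k-1)'."""
    degree = random.randint(1, max_degree)
    factors = []
    for _ in range(degree):
        signal = random.choice(["y", "u"])
        lag = random.randint(1, max_lag)
        factors.append(f"{signal}(k-{lag})")
    return "*".join(sorted(factors))

def random_model(max_terms=4):
    """Sample a model structure as a set of distinct regressors.
    A derivation using only u-factors would be FIR-like; deeper
    derivations reach polynomial NARX-style structures."""
    n_terms = random.randint(1, max_terms)
    return sorted({random_term() for _ in range(n_terms)})

def mutate(model):
    """One EA-style structural mutation: add or drop a regressor."""
    model = list(model)
    if model and random.random() < 0.5:
        model.remove(random.choice(model))
    else:
        model.append(random_term())
    return sorted(set(model)) or [random_term()]

parent = random_model()
child = mutate(parent)
print("parent:", parent)
print("child :", child)
```

The point of using a grammar in this role is that every generated individual is a syntactically valid model of the target class by construction, so EA operators never produce ill-formed structures.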
Related papers
- Learnable & Interpretable Model Combination in Dynamic Systems Modeling [0.0]
We discuss which types of models are usually combined and propose a model interface capable of expressing a variety of mixed equation-based models.
We propose a new wildcard topology that is capable of describing the generic connection between two combined models in an easy-to-interpret fashion.
The contributions of this paper are highlighted in a proof of concept: different connection topologies between two models are learned, interpreted and compared.
arXiv Detail & Related papers (2024-06-12T11:17:11Z) - Language Model Cascades [72.18809575261498]
Repeated interactions at test-time with a single model, or the composition of multiple models together, further expands capabilities.
Cases with control flow and dynamic structure require techniques from probabilistic programming.
We formalize several existing techniques from this perspective, including scratchpads / chain of thought, verifiers, STaR, selection-inference, and tool use.
arXiv Detail & Related papers (2022-07-21T07:35:18Z) - Indeterminacy in Latent Variable Models: Characterization and Strong
Identifiability [3.959606869996233]
We construct a theoretical framework for analyzing the indeterminacies of latent variable models.
We then investigate how we might specify strongly identifiable latent variable models.
arXiv Detail & Related papers (2022-06-02T00:01:27Z) - Re-parameterizing Your Optimizers rather than Architectures [119.08740698936633]
We propose a novel paradigm of incorporating model-specific prior knowledge into optimizers and using them to train generic (simple) models.
As an implementation, we propose a novel methodology to add prior knowledge by modifying the gradients according to a set of model-specific hyper-parameters.
Focusing on a VGG-style plain model, we showcase that such a simple model trained with a RepOptimizer, referred to as RepOpt-VGG, performs on par with the recent well-designed models.
arXiv Detail & Related papers (2022-05-30T16:55:59Z) - On generative models as the basis for digital twins [0.0]
A framework is proposed for generative models as a basis for digital twins or mirrors of structures.
The proposal is based on the premise that deterministic models cannot account for the uncertainty present in most structural modelling applications.
arXiv Detail & Related papers (2022-03-08T20:34:56Z) - Low-Rank Constraints for Fast Inference in Structured Models [110.38427965904266]
This work demonstrates a simple approach to reduce the computational and memory complexity of a large class of structured models.
Experiments with neural parameterized structured models for language modeling, polyphonic music modeling, unsupervised grammar induction, and video modeling show that our approach matches the accuracy of standard models at large state spaces.
arXiv Detail & Related papers (2022-01-08T00:47:50Z) - Meta-Model Structure Selection: Building Polynomial NARX Model for
Regression and Classification [0.0]
This work presents a new meta-heuristic approach to select the structure of NARX models for regression and classification problems.
The robustness of the new algorithm is tested on several simulated and experimental systems with different nonlinear characteristics.
arXiv Detail & Related papers (2021-09-21T02:05:40Z) - Model-agnostic multi-objective approach for the evolutionary discovery
of mathematical models [55.41644538483948]
In modern data science, it is often more valuable to understand the properties of a model and which of its parts could be replaced to obtain better results.
We use multi-objective evolutionary optimization for composite data-driven model learning to obtain the algorithm's desired properties.
arXiv Detail & Related papers (2021-07-07T11:17:09Z) - Bilinear Classes: A Structural Framework for Provable Generalization in
RL [119.42509700822484]
Bilinear Classes is a new structural framework which permits generalization in reinforcement learning.
The framework incorporates nearly all existing models in which a polynomial sample complexity is achievable.
Our main result provides an RL algorithm which has polynomial sample complexity for Bilinear Classes.
arXiv Detail & Related papers (2021-03-19T16:34:20Z) - Struct-MMSB: Mixed Membership Stochastic Blockmodels with Interpretable
Structured Priors [13.712395104755783]
The mixed membership stochastic blockmodel (MMSB) is a popular framework for community detection and network generation.
We present a flexible MMSB model, Struct-MMSB, that uses a recently developed statistical relational learning model, hinge-loss Markov random fields (HL-MRFs).
Our model is capable of learning latent characteristics in real-world networks via meaningful latent variables encoded as a complex combination of observed features and membership distributions.
arXiv Detail & Related papers (2020-02-21T19:32:32Z) - Learning Gaussian Graphical Models via Multiplicative Weights [54.252053139374205]
We adapt an algorithm of Klivans and Meka based on the method of multiplicative weight updates.
The algorithm enjoys a sample complexity bound that is qualitatively similar to others in the literature.
It has a low runtime $O(mp^2)$ in the case of $m$ samples and $p$ nodes, and can trivially be implemented in an online manner.
arXiv Detail & Related papers (2020-02-20T10:50:58Z)
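The multiplicative-weights method mentioned in the last entry can be illustrated with a generic Hedge-style update (a sketch of the update rule only, not the Klivans–Meka graphical-model learner; the learning rate and losses below are illustrative): each expert's weight is scaled by exp(-eta * loss) and renormalized, so one round costs time linear in the number of experts.

```python
import math

# Generic multiplicative-weights (Hedge) update; illustrative sketch only,
# not the Klivans-Meka algorithm from the entry above.

def hedge(losses_per_round, eta=0.5):
    """losses_per_round: list of rounds, each a list of expert losses in [0, 1].
    Returns the final normalized weight vector over experts."""
    n = len(losses_per_round[0])
    w = [1.0] * n
    for losses in losses_per_round:
        # Scale each weight down by how much its expert lost this round.
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
        total = sum(w)
        w = [wi / total for wi in w]  # renormalize each round
    return w

# Expert 0 always incurs zero loss, expert 1 always loses:
# weight mass should concentrate on expert 0 over the rounds.
rounds = [[0.0, 1.0] for _ in range(20)]
weights = hedge(rounds)
print(weights)
```

Each update touches every weight exactly once, which is the property behind per-iteration costs like the $O(mp^2)$ runtime cited above when the update is applied across $p$ nodes and $m$ samples.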
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.