Deep Polynomial Chaos Expansion
- URL: http://arxiv.org/abs/2507.21273v1
- Date: Mon, 28 Jul 2025 18:59:46 GMT
- Title: Deep Polynomial Chaos Expansion
- Authors: Johannes Exenberger, Sascha Ranftl, Robert Peharz
- Abstract summary: Polynomial chaos expansion (PCE) is a classical and widely used surrogate modeling technique. DeepPCE is a deep generalization of PCE that scales effectively to high-dimensional input spaces.
- Score: 5.6189692698829115
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Polynomial chaos expansion (PCE) is a classical and widely used surrogate modeling technique in physical simulation and uncertainty quantification. By taking a linear combination of a set of basis polynomials - orthonormal with respect to the distribution of uncertain input parameters - PCE enables tractable inference of key statistical quantities, such as (conditional) means, variances, covariances, and Sobol sensitivity indices, which are essential for understanding the modeled system and identifying influential parameters and their interactions. As the number of basis functions grows exponentially with the number of parameters, PCE does not scale well to high-dimensional problems. We address this challenge by combining PCE with ideas from probabilistic circuits, resulting in the deep polynomial chaos expansion (DeepPCE) - a deep generalization of PCE that scales effectively to high-dimensional input spaces. DeepPCE achieves predictive performance comparable to that of multi-layer perceptrons (MLPs), while retaining PCE's ability to compute exact statistical inferences via simple forward passes.
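To make this concrete, here is a minimal NumPy sketch of a classical (shallow) PCE, not the DeepPCE architecture itself: standard-normal inputs with an orthonormal Hermite basis, coefficients fit by least squares, and the mean, variance, and first-order Sobol indices read directly off the coefficients. The toy response function and all names are illustrative.

```python
import math
from itertools import product

import numpy as np

def hermite_orthonormal(n, x):
    # Probabilists' Hermite polynomials via He_{k+1}(x) = x He_k(x) - k He_{k-1}(x),
    # scaled by 1/sqrt(n!) so the basis is orthonormal under the standard normal.
    he, he_next = np.ones_like(x), x
    for k in range(n):
        he, he_next = he_next, x * he_next - (k + 1) * he
    return he / np.sqrt(math.factorial(n))

d, p = 2, 2  # input dimension, maximum total degree
alphas = [a for a in product(range(p + 1), repeat=d) if sum(a) <= p]

def design_matrix(X):
    # One column per multi-index: a product of 1D Hermite polynomials.
    return np.column_stack([
        np.prod([hermite_orthonormal(ai, X[:, i]) for i, ai in enumerate(a)], axis=0)
        for a in alphas
    ])

rng = np.random.default_rng(0)
X = rng.standard_normal((500, d))              # Gaussian inputs match the basis
y = np.sin(X[:, 0]) + 0.5 * X[:, 0] * X[:, 1]  # toy simulator response
c, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Orthonormality turns moment computation into algebra on the coefficients:
mean = c[alphas.index((0, 0))]                 # E[y] is the constant coefficient
var = sum(ci**2 for a, ci in zip(alphas, c) if sum(a) > 0)
# First-order Sobol index of x_i: variance share of terms involving x_i alone.
sobol = [sum(ci**2 for a, ci in zip(alphas, c) if a[i] > 0 and sum(a) == a[i]) / var
         for i in range(d)]
print(f"mean={mean:.3f}  var={var:.3f}  sobol={np.round(sobol, 3)}")
```

The property DeepPCE is designed to preserve is exactly this one: orthonormality reduces moment and sensitivity computation to simple algebra on the coefficients, with no Monte Carlo over the surrogate.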
Related papers
- Neural Chaos: A Spectral Stochastic Neural Operator [0.0]
Polynomial Chaos Expansion (PCE) is widely recognized as a go-to method for constructing surrogate solutions in both intrusive and non-intrusive ways. We propose an algorithm that identifies neural network (NN) basis functions in a purely data-driven manner. We demonstrate the effectiveness of the proposed scheme through several numerical examples.
arXiv Detail & Related papers (2025-02-17T14:30:46Z)
- Physics-Informed Polynomial Chaos Expansions [7.5746822137722685]
This paper presents a novel methodology for the construction of physics-informed polynomial chaos expansions (PCE).
A computationally efficient means for physically constrained PCE is proposed and compared to standard sparse PCE.
We show that the constrained PCEs can be easily applied for uncertainty quantification through analytical post-processing (a constrained-fit sketch follows this entry).
arXiv Detail & Related papers (2023-09-04T16:16:34Z)
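As a rough illustration of the constrained-PCE idea (a hedged sketch, not the paper's algorithm), the snippet below fits a 1D Legendre PCE by least squares while enforcing an assumed boundary condition y(1) = 0 exactly via a KKT system; the data, degree, and constraint are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
deg = 4
x = rng.uniform(-1.0, 1.0, 200)                        # uniform input -> Legendre basis
y = 1.0 - x**2 + 0.05 * rng.standard_normal(x.size)    # noisy data with y(1) = 0

norm = np.sqrt(2 * np.arange(deg + 1) + 1)             # orthonormal scaling on U(-1, 1)
A = np.polynomial.legendre.legvander(x, deg) * norm    # least-squares design matrix
B = np.polynomial.legendre.legvander(np.array([1.0]), deg) * norm  # constraint row
b = np.array([0.0])                                    # assumed physics: y(1) = 0 exactly

# KKT system for: minimize ||A c - y||^2  subject to  B c = b
n = deg + 1
K = np.block([[2.0 * A.T @ A, B.T], [B, np.zeros((1, 1))]])
c = np.linalg.solve(K, np.concatenate([2.0 * A.T @ y, b]))[:n]

print("surrogate at x = 1:", float(B @ c))             # honors the constraint exactly
```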
- IPCC-TP: Utilizing Incremental Pearson Correlation Coefficient for Joint Multi-Agent Trajectory Prediction [73.25645602768158]
IPCC-TP is a novel relevance-aware module based on Incremental Pearson Correlation Coefficient to improve multi-agent interaction modeling.
Our module can be conveniently embedded into existing multi-agent prediction methods to extend original motion distribution decoders.
arXiv Detail & Related papers (2023-03-01T15:16:56Z)
- The sparse Polynomial Chaos expansion: a fully Bayesian approach with joint priors on the coefficients and global selection of terms [0.0]
Polynomial chaos expansion (PCE) is a versatile tool widely used in uncertainty quantification and machine learning.
Sparse PCE variants can overcome the curse of dimensionality very efficiently, but specific attention must be paid to the strategy for choosing training points.
In this study, we develop and evaluate a fully Bayesian approach to establish the PCE representation via joint shrinkage priors and Markov chain Monte Carlo (a simplified sparse-fit sketch follows this entry).
arXiv Detail & Related papers (2022-04-12T19:00:00Z)
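The paper's approach is fully Bayesian with MCMC; as a much simpler stand-in that conveys the shrinkage idea, the MAP estimate under an i.i.d. Laplace prior on the coefficients reduces to an L1-penalized PCE fit, sketched here with iterative soft-thresholding (all settings illustrative).

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-1.0, 1.0, 300)
y = 2.0 * x + 0.5 * (3.0 * x**2 - 1.0) / 2.0       # only degrees 1 and 2 are active
deg = 8                                            # deliberately over-complete basis

norm = np.sqrt(2 * np.arange(deg + 1) + 1)         # orthonormal Legendre scaling
A = np.polynomial.legendre.legvander(x, deg) * norm

lam = 0.05                                         # Laplace prior scale -> L1 strength
step = 1.0 / np.linalg.norm(A, 2) ** 2             # ISTA step from the spectral norm
c = np.zeros(deg + 1)
for _ in range(5000):
    c = c - step * (A.T @ (A @ c - y))             # gradient step on the squared loss
    c = np.sign(c) * np.maximum(np.abs(c) - step * lam, 0.0)  # soft-threshold (shrinkage)

print(np.round(c, 3))                              # most coefficients shrink to zero
```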
- Parameterized Consistency Learning-based Deep Polynomial Chaos Neural Network Method for Reliability Analysis in Aerospace Engineering [3.541245871465521]
Polynomial chaos expansion (PCE) is a powerful surrogate-model-based reliability analysis method in aerospace engineering.
Constructing a high-order PCE model, however, requires a large amount of training data. To alleviate this problem, this paper proposes a parameterized consistency learning-based deep polynomial chaos neural network (Deep PCNN) method.
The Deep PCNN method can significantly reduce the training data cost in constructing a high-order PCE model.
arXiv Detail & Related papers (2022-03-29T15:15:12Z)
- HyperSPNs: Compact and Expressive Probabilistic Circuits [89.897635970366]
HyperSPNs are a new paradigm for generating the mixture weights of large PCs using a small-scale neural network.
We show the merits of our regularization strategy on two state-of-the-art PC families introduced in recent literature (a toy weight-generator sketch follows this entry).
arXiv Detail & Related papers (2021-12-02T01:24:43Z)
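A toy rendering of the idea (illustrative only, not the authors' code): store small per-node embeddings plus one shared generator network instead of every sum-node weight vector, so the parameter count drops while each node still gets valid mixture weights.

```python
import numpy as np

rng = np.random.default_rng(4)
n_nodes, k, h = 500, 64, 8            # sum nodes, weights per node, embedding size

E = rng.standard_normal((n_nodes, h)) * 0.1         # learned per-node embeddings
W1, b1 = rng.standard_normal((32, h)) * 0.1, np.zeros(32)
W2, b2 = rng.standard_normal((k, 32)) * 0.1, np.zeros(k)

def node_weights(e):
    # Shared generator: embedding -> k logits -> softmax mixture weights.
    logits = W2 @ np.tanh(W1 @ e + b1) + b2
    p = np.exp(logits - logits.max())
    return p / p.sum()

weights = np.apply_along_axis(node_weights, 1, E)   # (n_nodes, k); each row sums to 1
direct = n_nodes * k                                # parameters if weights stored directly
hyper = E.size + W1.size + b1.size + W2.size + b2.size
print(weights.shape, float(weights[0].sum()), direct, hyper)  # hyper (6400) < direct (32000)
```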
- Pseudo-Spherical Contrastive Divergence [119.28384561517292]
We propose pseudo-spherical contrastive divergence (PS-CD) to generalize maximum likelihood learning of energy-based models.
PS-CD avoids the intractable partition function and provides a generalized family of learning objectives.
arXiv Detail & Related papers (2021-11-01T09:17:15Z)
- Mini-data-driven Deep Arbitrary Polynomial Chaos Expansion for Uncertainty Quantification [9.586968666707529]
This paper proposes a deep arbitrary polynomial chaos expansion (Deep aPCE) method to improve the balance between surrogate model accuracy and training data cost.
Four numerical examples and an actual engineering problem are used to verify the effectiveness of the Deep aPCE method.
arXiv Detail & Related papers (2021-07-22T02:49:07Z)
- Permutation Invariant Policy Optimization for Mean-Field Multi-Agent Reinforcement Learning: A Principled Approach [128.62787284435007]
We propose the mean-field proximal policy optimization (MF-PPO) algorithm, at the core of which is a permutation-invariant actor-critic neural architecture.
We prove that MF-PPO attains the globally optimal policy at a sublinear rate of convergence.
In particular, we show that the inductive bias introduced by the permutation-invariant neural architecture enables MF-PPO to outperform existing competitors (a small invariance sketch follows this entry).
arXiv Detail & Related papers (2021-05-18T04:35:41Z)
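A minimal sketch of the permutation-invariance ingredient (an illustration under assumed details; the paper's actual architecture may differ): a shared encoder followed by mean pooling makes the critic's output independent of agent ordering.

```python
import numpy as np

rng = np.random.default_rng(5)
n_agents, obs_dim, hid = 10, 4, 32

W_phi = rng.standard_normal((hid, obs_dim)) * 0.1   # shared per-agent encoder
w_rho = rng.standard_normal(hid) * 0.1              # value head on the pooled feature

def value(obs):
    # obs: (n_agents, obs_dim). Encode each agent with the same weights,
    # then mean-pool so the result cannot depend on agent ordering.
    pooled = np.tanh(obs @ W_phi.T).mean(axis=0)
    return float(w_rho @ np.tanh(pooled))

obs = rng.standard_normal((n_agents, obs_dim))
perm = rng.permutation(n_agents)
print(np.isclose(value(obs), value(obs[perm])))     # True: permutation-invariant
```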
- FLIP: A flexible initializer for arbitrarily-sized parametrized quantum circuits [105.54048699217668]
We propose a FLexible Initializer for arbitrarily-sized Parametrized quantum circuits.
FLIP can be applied to any family of PQCs, and instead of relying on a generic set of initial parameters, it is tailored to learn the structure of successful parameters.
We illustrate the advantage of using FLIP in three scenarios: a family of problems with proven barren plateaus, PQC training to solve max-cut problem instances, and PQC training for finding the ground state energies of 1D Fermi-Hubbard models.
arXiv Detail & Related papers (2021-03-15T17:38:33Z)
- Repulsive Mixture Models of Exponential Family PCA for Clustering [127.90219303669006]
The mixture extension of exponential family principal component analysis (EPCA) was designed to encode much more structural information about the data distribution than the traditional EPCA.
The traditional mixture of local EPCAs has the problem of model redundancy, i.e., overlaps among mixing components, which may cause ambiguity for data clustering.
In this paper, a repulsiveness-encouraging prior is introduced among mixing components and a diversified EPCA mixture (DEPCAM) model is developed in the Bayesian framework.
arXiv Detail & Related papers (2020-04-07T04:07:29Z)