Disentangled Representation Learning for Parametric Partial Differential Equations
- URL: http://arxiv.org/abs/2410.02136v1
- Date: Thu, 3 Oct 2024 01:40:39 GMT
- Title: Disentangled Representation Learning for Parametric Partial Differential Equations
- Authors: Ning Liu, Lu Zhang, Tian Gao, Yue Yu
- Abstract summary: We propose a new paradigm for learning disentangled representations from neural operator parameters.
DisentangO is a novel hyper-neural operator architecture designed to unveil and disentangle the latent physical factors of variation embedded within the black-box neural operator parameters.
We show that DisentangO effectively extracts meaningful and interpretable latent features, bridging the divide between predictive performance and physical understanding in neural operator frameworks.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural operators (NOs) have demonstrated remarkable success in learning mappings between function spaces, serving as efficient approximators for the forward solutions of complex physical systems governed by partial differential equations (PDEs). However, while effective as black-box solvers, they offer limited insight into the underlying physical mechanism, due to the lack of interpretable representations of the physical parameters that drive the system. To tackle this challenge, we propose a new paradigm for learning disentangled representations from neural operator parameters, thereby effectively solving an inverse problem. Specifically, we introduce DisentangO, a novel hyper-neural operator architecture designed to unveil and disentangle the latent physical factors of variation embedded within the black-box neural operator parameters. At the core of DisentangO is a multi-task neural operator architecture that distills the varying parameters of the governing PDE through a task-wise adaptive layer, coupled with a hierarchical variational autoencoder that disentangles these variations into identifiable latent factors. By learning these disentangled representations, our model not only enhances physical interpretability but also enables more robust generalization across diverse physical systems. Empirical evaluations across supervised, semi-supervised, and unsupervised learning contexts show that DisentangO effectively extracts meaningful and interpretable latent features, bridging the divide between predictive performance and physical understanding in neural operator frameworks.
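The core mechanism described in the abstract, latent physical factors decoded into the weights of a task-wise adaptive layer inside a shared neural operator, can be illustrated with a minimal sketch. All dimensions, names, and the linear decoder below are illustrative assumptions for exposition, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper):
latent_dim = 4    # disentangled latent physical factors z
hidden_dim = 16   # width of the task-wise adaptive layer

# A linear "hypernetwork" that decodes z into the adaptive layer's weights.
W_hyper = 0.1 * rng.standard_normal((hidden_dim * hidden_dim, latent_dim))

def adaptive_weights(z):
    """Decode latent factors z into a (hidden_dim, hidden_dim) weight matrix."""
    return (W_hyper @ z).reshape(hidden_dim, hidden_dim)

def operator_forward(u, z):
    """Apply the task-adaptive layer to a discretized input function u."""
    return np.tanh(adaptive_weights(z) @ u)

u = rng.standard_normal(hidden_dim)    # same input function for both tasks
z_a = rng.standard_normal(latent_dim)  # latent factors for physical system A
z_b = rng.standard_normal(latent_dim)  # latent factors for physical system B

out_a = operator_forward(u, z_a)
out_b = operator_forward(u, z_b)
# Different latent factors induce different operators on the same input.
print(out_a.shape, bool(np.allclose(out_a, out_b)))
```

In the full architecture these latent factors would come from the hierarchical variational autoencoder rather than being sampled, so that each identifiable factor corresponds to one physical source of variation across tasks.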
Related papers
- Learning Physical Operators using Neural Operators [10.57578521926415]
We train neural operators to learn individual non-linear physical operators while approximating linear operators with fixed finite-difference convolutions. We formulate the modelling task as a neural ordinary differential equation (ODE) where these learned operators constitute the right-hand side. Our approach achieves better convergence and superior performance when generalising to unseen physics.
arXiv Detail & Related papers (2026-02-26T15:27:14Z) - Geometric Neural Operators via Lie Group-Constrained Latent Dynamics [14.152015935335358]
We show that our method effectively lowers the relative prediction error by 30-50% at the cost of a 2.26% parameter increase. The results show that our approach provides a scalable solution for improving long-term prediction fidelity.
arXiv Detail & Related papers (2026-02-18T06:17:47Z) - Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations [65.80144621950981]
We build on Wiener chaos expansions (WCE) to design neural operator (NO) architectures for SPDEs and SDEs. We show that WCE-based neural operators provide a practical and scalable way to learn SDE/SPDE solution operators.
arXiv Detail & Related papers (2026-01-03T00:59:25Z) - How deep is your network? Deep vs. shallow learning of transfer operators [0.4473327661758546]
We propose a randomized neural network approach called RaNNDy for learning transfer operators and their spectral decompositions from data. The main advantage is that, without a noticeable reduction in accuracy, this approach significantly reduces the training time and resources. We present results for different dynamical operators, including Koopman and Perron-Frobenius operators, which have important applications in analyzing the behavior of complex dynamical systems.
arXiv Detail & Related papers (2025-09-24T09:38:42Z) - Sparse Autoencoder Neural Operators: Model Recovery in Function Spaces [75.45093712182624]
We introduce a framework that extends sparse autoencoders (SAEs) to lifted spaces and infinite-dimensional function spaces, enabling mechanistic interpretability of large neural operators (NO). We compare the inference and training dynamics of SAEs, lifted-SAE, and SAE neural operators. We highlight how lifting and operator modules introduce beneficial inductive biases, enabling faster recovery, improved recovery of smooth concepts, and robust inference across varying resolutions, a property unique to neural operators.
arXiv Detail & Related papers (2025-09-03T21:57:03Z) - Causal Operator Discovery in Partial Differential Equations via Counterfactual Physics-Informed Neural Networks [0.0]
We develop a principled framework for discovering causal structure in partial differential equations (PDEs) using physics-informed neural networks and counterfactual minimizations. We validate the framework on both synthetic and real-world datasets across climate dynamics, tumor diffusion, and ocean flows. This work positions causal PDE discovery as a tractable and interpretable inference task grounded in structural causal models and variational residual analysis.
arXiv Detail & Related papers (2025-06-25T07:15:42Z) - A Unified Framework for Simultaneous Parameter and Function Discovery in Differential Equations [0.0]
Inverse problems involving differential equations often require identifying unknown parameters or functions from data. Existing approaches, such as Physics-Informed Neural Networks (PINNs), are effective at isolating either parameters or functions but can face challenges when applied simultaneously due to solution non-uniqueness. We introduce a framework that addresses these limitations by establishing conditions under which unique solutions can be guaranteed.
arXiv Detail & Related papers (2025-05-22T17:56:38Z) - Invertible Koopman neural operator for data-driven modeling of partial differential equations [15.007354910932039]
Invertible Koopman Neural Operator (IKNO) is a novel data-driven modeling approach inspired by the Koopman operator theory and neural operator.
IKNO parameterizes the observable function and its inverse simultaneously using the same learnable parameters.
arXiv Detail & Related papers (2025-03-25T14:43:53Z) - DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - DeltaPhi: Physical States Residual Learning for Neural Operators in Data-Limited PDE Solving [54.605760146540234]
DeltaPhi is a novel learning framework that transforms the PDE solving task from learning direct input-output mappings to learning the residuals between similar physical states. Extensive experiments demonstrate consistent and significant improvements across diverse physical systems.
arXiv Detail & Related papers (2024-06-14T07:45:07Z) - Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z) - Physics-informed Discretization-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to various discrete representations of PDE parameters and irregular domain shapes.
Inspired by deep operator neural networks, our model learns a discretization-independent parameter embedding that is applied repeatedly.
Numerical results demonstrate the accuracy and efficiency of the proposed method.
arXiv Detail & Related papers (2024-04-21T12:41:30Z) - Neural Parameter Regression for Explicit Representations of PDE Solution Operators [22.355460388065964]
We introduce Neural Parameter Regression (NPR), a novel framework specifically developed for learning solution operators of Partial Differential Equations (PDEs).
NPR employs Physics-Informed Neural Network (PINN, Raissi et al., 2021) techniques to regress Neural Network (NN) parameters.
The framework shows remarkable adaptability to new initial and boundary conditions, allowing for rapid fine-tuning and inference.
arXiv Detail & Related papers (2024-03-19T14:30:56Z) - Hierarchical Invariance for Robust and Interpretable Vision Tasks at Larger Scales [54.78115855552886]
We show how to construct over-complete invariants with a Convolutional Neural Network (CNN)-like hierarchical architecture.
With the over-completeness, discriminative features w.r.t. the task can be adaptively formed in a Neural Architecture Search (NAS)-like manner.
For robust and interpretable vision tasks at larger scales, hierarchical invariant representation can be considered as an effective alternative to traditional CNN and invariants.
arXiv Detail & Related papers (2024-02-23T16:50:07Z) - Manipulating Feature Visualizations with Gradient Slingshots [54.31109240020007]
We introduce a novel method for manipulating Feature Visualization (FV) without significantly impacting the model's decision-making process.
We evaluate the effectiveness of our method on several neural network models and demonstrate its capabilities to hide the functionality of arbitrarily chosen neurons.
arXiv Detail & Related papers (2024-01-11T18:57:17Z) - Deciphering and integrating invariants for neural operator learning with various physical mechanisms [22.508244510177683]
We propose Physical Invariant Attention Neural Operator (PIANO) to decipher and integrate the physical invariants (PI) for operator learning from the PDE series with various physical mechanisms.
Compared to existing techniques, PIANO can reduce the relative error by 13.6%-82.2% on PDE forecasting tasks across varying coefficients, forces, or boundary conditions.
arXiv Detail & Related papers (2023-11-24T09:03:52Z) - Interpretable Neural PDE Solvers using Symbolic Frameworks [0.0]
Partial differential equations (PDEs) are ubiquitous in the world around us, modelling phenomena from heat and sound to quantum systems.
Recent advances in deep learning have resulted in the development of powerful neural solvers.
However, a significant challenge remains in their interpretability.
arXiv Detail & Related papers (2023-10-31T13:56:25Z) - Neural Implicit Representations for Physical Parameter Inference from a Single Video [49.766574469284485]
We propose to combine neural implicit representations for appearance modeling with neural ordinary differential equations (ODEs) for modelling physical phenomena.
Our proposed model combines several unique advantages: (i) Contrary to existing approaches that require large training datasets, we are able to identify physical parameters from only a single video.
The use of neural implicit representations enables the processing of high-resolution videos and the synthesis of photo-realistic images.
arXiv Detail & Related papers (2022-04-29T11:55:35Z) - Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite-dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z) - Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs). We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.