Wave Physics-informed Matrix Factorizations
- URL: http://arxiv.org/abs/2312.13584v2
- Date: Sun, 31 Dec 2023 01:52:57 GMT
- Title: Wave Physics-informed Matrix Factorizations
- Authors: Harsha Vardhan Tetali, Joel B. Harley, Benjamin D. Haeffele
- Abstract summary: In many applications that involve a signal propagating through physical media, the dynamics of the signal must satisfy constraints imposed by the wave equation.
Here we propose a matrix factorization technique that decomposes such signals into a sum of components.
We establish theoretical connections between wave-informed learning and filtering theory in signal processing.
- Score: 8.64018020390058
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the recent success of representation learning methods, which includes
deep learning as a special case, there has been considerable interest in
developing techniques that incorporate known physical constraints into the
learned representation. As one example, in many applications that involve a
signal propagating through physical media (e.g., optics, acoustics, fluid
dynamics, etc.), it is known that the dynamics of the signal must satisfy
constraints imposed by the wave equation. Here we propose a matrix
factorization technique that decomposes such signals into a sum of components,
where each component is regularized to ensure that it nearly satisfies wave
equation constraints. Although our proposed formulation is non-convex, we prove
that our model can be efficiently solved to global optimality. Through this
line of work we establish theoretical connections between wave-informed
learning and filtering theory in signal processing. We further demonstrate the
application of this work on modal analysis problems commonly arising in
structural diagnostics and prognostics.
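The abstract's factorization can be pictured as a low-rank fit in which each spatial component is penalized for violating a (Helmholtz-form) wave equation. The alternating gradient scheme, penalty weight `gamma`, and wavenumbers `k_vals` below are illustrative assumptions for a minimal sketch, not the paper's exact algorithm:

```python
import numpy as np

# Sketch: factor Y ≈ U @ V.T where each column of U is encouraged to
# (nearly) satisfy a 1-D Helmholtz equation u'' + k^2 u = 0, discretized
# with a second-difference matrix on a unit grid.

def second_difference(n):
    """Dense second-difference (discrete 1-D Laplacian) matrix."""
    return -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

def wave_informed_factorization(Y, rank, k_vals, gamma=0.1, lr=1e-3,
                                iters=2000, seed=0):
    """Gradient descent on ||Y - U V^T||^2 + gamma * sum_r ||(D + k_r^2 I) u_r||^2."""
    n, t = Y.shape
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((t, rank))
    D = second_difference(n)
    for _ in range(iters):
        R = U @ V.T - Y                       # reconstruction residual
        grad_U = R @ V
        for r in range(rank):
            # wave-equation (Helmholtz) residual of component r
            w = D @ U[:, r] + (k_vals[r] ** 2) * U[:, r]
            grad_U[:, r] += gamma * (D.T @ w + (k_vals[r] ** 2) * w)
        grad_V = R.T @ U
        U -= lr * grad_U
        V -= lr * grad_V
    return U, V
```

On a synthetic two-mode signal (e.g., outer products of sinusoids in space and time), this recovers a low reconstruction error while biasing each spatial component toward a single wavenumber.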
Related papers
- A Sampling Theory Perspective on Activations for Implicit Neural
Representations [73.6637608397055]
Implicit Neural Representations (INRs) have gained popularity for encoding signals as compact, differentiable entities.
We conduct a comprehensive analysis of these activations from a sampling theory perspective.
Our investigation reveals that sinc activations, previously unused in conjunction with INRs, are theoretically optimal for signal encoding.
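A sinc-activated INR layer is straightforward to sketch. The layer sizes, the frequency scale `w0`, and the uniform weight initialization below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def sinc(x):
    """Unnormalized sinc: sin(x)/x, with sinc(0) = 1.

    np.sinc is the normalized variant sin(pi t)/(pi t), so rescale the input.
    """
    return np.sinc(x / np.pi)

class SincLayer:
    """One layer of a sinc-activated implicit neural representation (sketch)."""
    def __init__(self, in_dim, out_dim, w0=6.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.uniform(-1.0, 1.0, (in_dim, out_dim)) / in_dim
        self.b = np.zeros(out_dim)
        self.w0 = w0
    def __call__(self, x):
        return sinc(self.w0 * (x @ self.W + self.b))
```

Stacking such layers and fitting the weights to (coordinate, value) pairs gives a continuous, differentiable representation of the signal.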
arXiv Detail & Related papers (2024-02-08T05:52:45Z)
- Harmonic (Quantum) Neural Networks [10.31053131199922]
Harmonic functions are abundant in nature, appearing in limiting cases of Maxwell's and the Navier-Stokes equations, as well as the heat and wave equations.
Despite their ubiquity and relevance, there have been few attempts to incorporate inductive biases towards harmonic functions in machine learning contexts.
We show effective means of representing harmonic functions in neural networks and extend such results to quantum neural networks.
arXiv Detail & Related papers (2022-12-14T19:13:59Z)
- Deep Equilibrium Assisted Block Sparse Coding of Inter-dependent Signals: Application to Hyperspectral Imaging [71.57324258813675]
A dataset of inter-dependent signals is defined as a matrix whose columns demonstrate strong dependencies.
A neural network is employed to act as a structural prior and reveal the underlying signal interdependencies.
Deep unrolling and Deep equilibrium based algorithms are developed, forming highly interpretable and concise deep-learning-based architectures.
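A core ingredient of deep unrolling is an iterative sparse-coding update that is repeated as network layers with learned parameters. The plain ISTA iteration below is a minimal sketch of that building block, not the paper's exact architecture:

```python
import numpy as np

# ISTA for sparse coding: min_x 0.5 * ||D x - y||^2 + lam * ||x||_1.
# Unrolled networks replace the fixed matrices below with learned
# per-layer parameters.

def soft_threshold(x, t):
    """Proximal operator of the l1 norm (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(D, y, lam=0.1, iters=200):
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - (D.T @ (D @ x - y)) / L, lam / L)
    return x
```

For an orthonormal dictionary the fixed point is simply the soft-thresholded projection of `y`, which makes the iteration easy to sanity-check.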
arXiv Detail & Related papers (2022-03-29T21:00:39Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of the (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Wave-Informed Matrix Factorization with Global Optimality Guarantees [8.89493507314525]
In many applications, the signal dynamics must satisfy constraints imposed by the wave equation.
We propose a matrix factorization technique that decomposes such signals into a sum of components.
We prove that our model can be efficiently solved to global optimality.
arXiv Detail & Related papers (2021-07-19T20:34:47Z)
- Prediction of Ultrasonic Guided Wave Propagation in Solid-fluid and their Interface under Uncertainty using Machine Learning [0.0]
We advance existing research by accounting for uncertainty in the material and geometric properties of a structure.
We develop an efficient algorithm that addresses the inherent complexity of solving the multiphysics problem under uncertainty.
The proposed approach provides an accurate prediction for the WpFSI problem in the presence of uncertainty.
arXiv Detail & Related papers (2021-03-30T01:05:14Z)
- A Framework for Fluid Motion Estimation using a Constraint-Based Refinement Approach [0.0]
We formulate a general framework for fluid motion estimation using a constraint-based refinement approach.
We demonstrate that for a particular choice of constraint, our results closely approximate the classical continuity equation-based method for fluid flow.
We also observe a surprising connection to the Cauchy-Riemann operator that diagonalizes the system leading to a diffusive phenomenon involving the divergence and the curl of the flow.
arXiv Detail & Related papers (2020-11-24T18:23:39Z)
- QuTiP-BoFiN: A bosonic and fermionic numerical hierarchical-equations-of-motion library with applications in light-harvesting, quantum control, and single-molecule electronics [51.15339237964982]
The hierarchical equations of motion (HEOM) method is a powerful, numerically exact approach for solving open-system dynamics.
It has been extended and applied to problems in solid-state physics, optics, single-molecule electronics, and biological physics.
We present a numerical library in Python, integrated with the powerful QuTiP platform, which implements the HEOM for both bosonic and fermionic environments.
arXiv Detail & Related papers (2020-10-21T07:54:56Z)
- Implicit Neural Representations with Periodic Activation Functions [109.2353097792111]
Implicitly defined, continuous, differentiable signal representations parameterized by neural networks have emerged as a powerful paradigm.
We propose to leverage periodic activation functions for implicit neural representations and demonstrate that these networks, dubbed sinusoidal representation networks or Sirens, are ideally suited for representing complex natural signals and their derivatives.
We show how Sirens can be leveraged to solve challenging boundary value problems, such as particular Eikonal equations, the Poisson equation, and the Helmholtz and wave equations.
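A Siren layer is a linear map followed by sin(w0 · (·)); the paper proposes a matched initialization (a wide first layer bound, scaled bounds thereafter). The numpy sketch below follows that scheme but the layer widths and `w0 = 30` default are only illustrative:

```python
import numpy as np

def siren_layer_init(in_dim, out_dim, w0=30.0, is_first=False, seed=0):
    """Initialize one Siren layer: uniform(-1/n, 1/n) for the first layer,
    uniform(-sqrt(6/n)/w0, sqrt(6/n)/w0) for hidden layers."""
    rng = np.random.default_rng(seed)
    bound = 1.0 / in_dim if is_first else np.sqrt(6.0 / in_dim) / w0
    W = rng.uniform(-bound, bound, (in_dim, out_dim))
    b = rng.uniform(-bound, bound, out_dim)
    return W, b

def siren_forward(x, layers, w0=30.0):
    """Apply sin(w0 * (x @ W + b)) for each (W, b) pair in turn."""
    for W, b in layers:
        x = np.sin(w0 * (x @ W + b))
    return x
```

Because the activation is sine, every intermediate output stays in [-1, 1], and all derivatives of the network are again Siren-like, which is what makes fitting derivative-constrained problems (Eikonal, Poisson, wave) tractable.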
arXiv Detail & Related papers (2020-06-17T05:13:33Z)
- Multilinear Compressive Learning with Prior Knowledge [106.12874293597754]
The Multilinear Compressive Learning (MCL) framework combines Multilinear Compressive Sensing and Machine Learning into an end-to-end system.
The key idea behind MCL is the assumption that there exists a tensor subspace which can capture the essential features of the signal for the downstream learning task.
In this paper, we propose a novel solution to the question of how to find tensor subspaces in which the signals of interest are highly separable.
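The multilinear projection at the heart of MCL compresses a tensor signal by a mode-n product with a small matrix along every mode, Y = X ×₁ U₁ ×₂ U₂ ×₃ U₃. The dimensions below are illustrative assumptions:

```python
import numpy as np

def mode_n_product(X, U, n):
    """Multiply tensor X by matrix U along mode n.

    U has shape (new_dim, X.shape[n]); the result replaces axis n of X
    with an axis of length new_dim.
    """
    Xt = np.moveaxis(X, n, 0)                 # bring mode n to the front
    out = np.tensordot(U, Xt, axes=(1, 0))    # contract over mode n
    return np.moveaxis(out, 0, n)             # restore axis order

def multilinear_project(X, mats):
    """Apply one mode-n product per mode: X x_1 U1 x_2 U2 ... (sensing step)."""
    for n, U in enumerate(mats):
        X = mode_n_product(X, U, n)
    return X
```

Learning the sensing matrices jointly with the downstream classifier is what makes the tensor subspace discriminative rather than merely reconstructive.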
arXiv Detail & Related papers (2020-02-17T19:06:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.