Rational kernel-based interpolation for complex-valued frequency response functions
- URL: http://arxiv.org/abs/2307.13484v3
- Date: Sat, 23 Nov 2024 20:37:13 GMT
- Title: Rational kernel-based interpolation for complex-valued frequency response functions
- Authors: Julien Bect, Niklas Georg, Ulrich Römer, Sebastian Schöps
- Abstract summary: This work is concerned with the kernel-based approximation of a complex-valued function from data.
We introduce new reproducing kernel Hilbert spaces of complex-valued functions, and formulate the problem of complex-valued interpolation with a kernel pair as minimum norm interpolation in these spaces.
Numerical results on examples from different fields, including electromagnetic and acoustic examples, illustrate the performance of the method.
- Abstract: This work is concerned with the kernel-based approximation of a complex-valued function from data, where the frequency response function of a partial differential equation in the frequency domain is of particular interest. In this setting, kernel methods are employed more and more frequently; however, standard kernels do not perform well. Moreover, the role and mathematical implications of the underlying pair of kernels, which arises naturally in the complex-valued case, remain to be addressed. We introduce new reproducing kernel Hilbert spaces of complex-valued functions, and formulate the problem of complex-valued interpolation with a kernel pair as minimum norm interpolation in these spaces. Moreover, we combine the interpolant with a low-order rational function, where the order is adaptively selected based on a new model selection criterion. Numerical results on examples from different fields, including electromagnetic and acoustic examples, illustrate the performance of the method, also in comparison to available rational approximation methods.
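A minimal sketch of the overall recipe described in the abstract, under simplifying assumptions: a single real-valued Gaussian kernel on the frequency axis stands in for the paper's kernel pair, the rational part is a fixed low-order fit obtained by linearised least squares rather than the adaptive model-selection step, and all function names, parameters, and the toy frequency response are illustrative rather than taken from the paper.

```python
import numpy as np


def gaussian_kernel(w1, w2, lengthscale=1.0):
    """Real-valued Gaussian kernel on the frequency axis (a simple
    stand-in for the paper's kernel pair)."""
    d = w1[:, None] - w2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)


def fit_rational(w, f, order=2):
    """Least-squares fit of a low-order rational function
    f(iw) ~ p(iw) / q(iw) with monic q, obtained by linearising
    q(iw) f = p(iw).  A crude stand-in for the adaptive step."""
    s = 1j * w
    # unknowns: numerator coefficients a_0..a_n, then the
    # non-leading denominator coefficients b_0..b_{n-1}
    A = np.concatenate(
        [s[:, None] ** np.arange(order + 1),
         -f[:, None] * s[:, None] ** np.arange(order)],
        axis=1,
    )
    coef, *_ = np.linalg.lstsq(A, f * s ** order, rcond=None)
    p = coef[: order + 1]
    q = np.concatenate([coef[order + 1:], [1.0]])
    return lambda wq: (np.polyval(p[::-1], 1j * wq)
                       / np.polyval(q[::-1], 1j * wq))


def rational_kernel_interpolant(w_train, f_train, lengthscale=1.0,
                                order=2, nugget=1e-8):
    """Fit the rational term, then kernel-interpolate the complex
    residual; the prediction is the sum of the two parts."""
    rat = fit_rational(w_train, f_train, order)
    resid = f_train - rat(w_train)
    K = gaussian_kernel(w_train, w_train, lengthscale)
    coef = np.linalg.solve(K + nugget * np.eye(len(w_train)),
                           resid.astype(complex))

    def predict(w_new):
        return rat(w_new) + gaussian_kernel(w_new, w_train, lengthscale) @ coef

    return predict


if __name__ == "__main__":
    # toy one-resonance frequency response, sampled at a few frequencies
    H = lambda w: 1.0 / (1.0 + 0.1j * w - (w / 4.0) ** 2)
    w_train = np.linspace(0.1, 10.0, 15)
    model = rational_kernel_interpolant(w_train, H(w_train),
                                        lengthscale=1.5, order=2)
    w_test = np.linspace(0.1, 10.0, 200)
    print("max abs error:", np.max(np.abs(model(w_test) - H(w_test))))
```

Fitting the rational term first and kernel-interpolating the complex residual is one simple way to combine the two components; the paper's actual construction and its order-selection criterion differ from this sketch.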
Related papers
- Optimal Kernel Choice for Score Function-based Causal Discovery [92.65034439889872]
We propose a kernel selection method within the generalized score function that automatically selects the optimal kernel that best fits the data.
We conduct experiments on both synthetic data and real-world benchmarks, and the results demonstrate that our proposed method outperforms existing kernel selection methods.
arXiv Detail & Related papers (2024-07-14T09:32:20Z)
- Enhancing Solutions for Complex PDEs: Introducing Complementary Convolution and Equivariant Attention in Fourier Neural Operators [17.91230192726962]
We propose a novel hierarchical Fourier neural operator along with convolution-residual layers and attention mechanisms to solve complex PDEs.
We find that the proposed method achieves superior performance in these PDE benchmarks, especially for equations characterized by rapid coefficient variations.
arXiv Detail & Related papers (2023-11-21T11:04:13Z)
- Non-separable Covariance Kernels for Spatiotemporal Gaussian Processes based on a Hybrid Spectral Method and the Harmonic Oscillator [0.0]
We present a hybrid spectral approach for generating covariance kernels based on physical arguments.
We derive explicit relations for the covariance kernels in the three oscillator regimes (underdamping, critical damping, overdamping) and investigate their properties.
arXiv Detail & Related papers (2023-02-19T14:12:48Z)
- Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z)
- Improved Random Features for Dot Product Kernels [12.321353062415701]
We make several novel contributions for improving the efficiency of random feature approximations for dot product kernels.
We show empirically that the use of complex features can significantly reduce the variances of these approximations (a toy numerical illustration of this effect appears after this list).
We develop a data-driven optimization approach to improve random feature approximations for general dot product kernels.
arXiv Detail & Related papers (2022-01-21T14:16:56Z)
- Hybrid Random Features [60.116392415715275]
We propose a new class of random feature methods for linearizing softmax and Gaussian kernels called hybrid random features (HRFs).
HRFs automatically adapt the quality of kernel estimation to provide the most accurate approximation in the defined regions of interest.
arXiv Detail & Related papers (2021-10-08T20:22:59Z)
- Spectrum Gaussian Processes Based On Tunable Basis Functions [15.088239458693003]
We introduce a novel basis function, which is tunable, local and bounded, to approximate the kernel function in the Gaussian process.
We conduct extensive experiments on open-source datasets to verify its performance.
arXiv Detail & Related papers (2021-07-14T03:51:24Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z)
- Learning interaction kernels in mean-field equations of 1st-order systems of interacting particles [1.776746672434207]
We introduce a nonparametric algorithm to learn interaction kernels of mean-field equations for 1st-order systems of interacting particles.
By least squares with regularization, the algorithm learns the kernel efficiently on data-adaptive hypothesis spaces.
arXiv Detail & Related papers (2020-10-29T15:37:17Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
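As a side note on the "Improved Random Features for Dot Product Kernels" entry above, here is a minimal sketch of why complex-valued features can reduce estimator variance, assuming a plain product sketch of the polynomial kernel $(x^\top y)^p$ with Rademacher weights versus fourth-roots-of-unity weights; the function names and parameters are illustrative and this is not that paper's full construction.

```python
import numpy as np

rng = np.random.default_rng(0)


def product_sketch(x, y, degree, n_features, complex_weights=False):
    """Unbiased random-feature estimate of the polynomial kernel
    (x . y)**degree from products of random linear projections.
    Weights are Rademacher (+/-1) or, if complex_weights=True,
    uniform over the fourth roots of unity {1, i, -1, -i}."""
    d = x.shape[0]
    alphabet = [1.0, 1j, -1.0, -1j] if complex_weights else [1.0, -1.0]
    W = rng.choice(alphabet, size=(n_features, degree, d))
    phi_x = np.prod(W @ x, axis=1)           # shape (n_features,)
    phi_y = np.prod(W @ y, axis=1)
    return np.real(np.mean(phi_x * np.conj(phi_y)))


if __name__ == "__main__":
    d, degree, n_features, trials = 10, 3, 200, 2000
    x, y = rng.standard_normal(d), rng.standard_normal(d)
    exact = (x @ y) ** degree
    real_est = [product_sketch(x, y, degree, n_features) for _ in range(trials)]
    cplx_est = [product_sketch(x, y, degree, n_features, True) for _ in range(trials)]
    print("exact value:     ", exact)
    print("real features:    mean={:.3f}  var={:.3f}".format(np.mean(real_est), np.var(real_est)))
    print("complex features: mean={:.3f}  var={:.3f}".format(np.mean(cplx_est), np.var(cplx_est)))
```

Both estimators are unbiased for the polynomial kernel; the complex weights zero out certain cross terms in the variance, which is the effect the entry's summary refers to.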
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.