General Fourier Feature Physics-Informed Extreme Learning Machine (GFF-PIELM) for High-Frequency PDEs
- URL: http://arxiv.org/abs/2510.12293v1
- Date: Tue, 14 Oct 2025 08:55:57 GMT
- Title: General Fourier Feature Physics-Informed Extreme Learning Machine (GFF-PIELM) for High-Frequency PDEs
- Authors: Fei Ren, Sifan Wang, Pei-Zhi Zhuang, Hai-Sui Yu, He Yang
- Abstract summary: We propose a general Fourier feature physics-informed extreme learning machine (GFF-PIELM). GFF-PIELM not only retains the high accuracy, efficiency, and simplicity of the PIELM framework but also inherits the ability of FFMs to effectively handle high-frequency problems. We carry out five case studies with a total of ten numerical examples to highlight the feasibility and validity of the proposed GFF-PIELM.
- Score: 4.652567513453756
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Conventional physics-informed extreme learning machine (PIELM) often faces challenges in solving partial differential equations (PDEs) involving high-frequency and variable-frequency behaviors. To address these challenges, we propose a general Fourier feature physics-informed extreme learning machine (GFF-PIELM). We demonstrate that directly concatenating multiple Fourier feature mappings (FFMs) and an extreme learning machine (ELM) network makes it difficult to determine frequency-related hyperparameters. Fortunately, we find an alternative to establish the GFF-PIELM in three main steps. First, we integrate a variation of FFM into ELM as the Fourier-based activation function, so there is still one hidden layer in the GFF-PIELM framework. Second, we assign a set of frequency coefficients to the hidden neurons, which enables the ELM network to capture diverse frequency components of target solutions. Finally, we develop an innovative, straightforward initialization method for these hyperparameters by monitoring the distribution of ELM output weights. GFF-PIELM not only retains the high accuracy, efficiency, and simplicity of the PIELM framework but also inherits the ability of FFMs to effectively handle high-frequency problems. We carry out five case studies with a total of ten numerical examples to highlight the feasibility and validity of the proposed GFF-PIELM, involving high frequency, variable frequency, multi-scale behaviour, irregular boundary and inverse problems. Compared to conventional PIELM, the GFF-PIELM approach significantly improves predictive accuracy without additional cost in training time and architecture complexity. Our results confirm that PIELM can be extended to solve high-frequency and variable-frequency PDEs with high accuracy, and our initialization strategy may further inspire advances in other physics-informed machine learning (PIML) frameworks.
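As a rough illustration of the pipeline the abstract describes, the sketch below solves a manufactured high-frequency boundary-value problem with a single-hidden-layer ELM whose activation is Fourier-based, so the output weights come from one linear least-squares solve. This is an assumed toy reconstruction, not the authors' implementation: here the per-neuron frequency coefficients are simply spread over a guessed range, whereas the paper initializes them by monitoring the distribution of ELM output weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model problem: u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0,
# manufactured so the exact solution is the high-frequency u(x) = sin(k*pi*x).
k = 10
f = lambda x: -(k * np.pi) ** 2 * np.sin(k * np.pi * x)
u_exact = lambda x: np.sin(k * np.pi * x)

# One hidden layer of N neurons with a Fourier-based activation
# sin(omega_j * x + b_j); omega_j are frequency coefficients spread over an
# assumed range [0, 2*k*pi], and the random phases b_j are frozen (ELM-style).
N = 200
omega = np.linspace(0.0, 2 * k * np.pi, N)
b = rng.uniform(0.0, 2 * np.pi, N)

def H(x):
    # hidden-layer outputs at points x, shape (len(x), N)
    return np.sin(np.outer(x, omega) + b)

def H_xx(x):
    # second x-derivative of each hidden-neuron output (analytic)
    return -(omega ** 2) * np.sin(np.outer(x, omega) + b)

# Stack PDE-residual rows at collocation points with two weighted boundary
# rows; the output weights beta are the least-squares solution.
xc = np.linspace(0.0, 1.0, 400)
w_bc = 100.0  # boundary weighting, an ad-hoc choice for this sketch
A = np.vstack([H_xx(xc), w_bc * H(np.array([0.0, 1.0]))])
rhs = np.concatenate([f(xc), [0.0, 0.0]])
beta, *_ = np.linalg.lstsq(A, rhs, rcond=None)

xt = np.linspace(0.0, 1.0, 1000)
err = np.max(np.abs(H(xt) @ beta - u_exact(xt)))
print(f"max abs error: {err:.2e}")
```

With a plain activation such as tanh, the same least-squares system struggles to represent sin(10&#960;x), which is the spectral-bias failure mode the paper targets.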
Related papers
- FAIM: Frequency-Aware Interactive Mamba for Time Series Classification [87.84511960413715]
Time series classification (TSC) is crucial in numerous real-world applications, such as environmental monitoring, medical diagnosis, and posture recognition. We propose FAIM, a lightweight Frequency-Aware Interactive Mamba model. We show that FAIM consistently outperforms existing state-of-the-art (SOTA) methods, achieving a superior trade-off between accuracy and efficiency.
arXiv Detail & Related papers (2025-11-26T08:36:33Z) - F-Adapter: Frequency-Adaptive Parameter-Efficient Fine-Tuning in Scientific Machine Learning [28.598598268071587]
We conduct the first systematic study of PEFT for pre-trained Large Operator Models (LOMs). We observe that the widely used Low-Rank Adaptation (LoRA) yields markedly poorer performance on LOMs than Adapter tuning. Motivated by the robust empirical gains of adapters and by our theoretical characterization of PDE solutions as spectrally sparse, we introduce the Frequency-Adaptive Adapter (F-Adapter).
arXiv Detail & Related papers (2025-09-27T08:05:28Z) - PDEfuncta: Spectrally-Aware Neural Representation for PDE Solution Modeling [26.573976229483442]
We propose a novel modulation technique that injects high-frequency information at each layer of the implicit neural representation. This enables compact and accurate representation of multiple solution fields using low-dimensional latent vectors. We also introduce PDEfuncta, a meta-learning framework designed to learn multi-modal solution fields and support generalization to new tasks.
arXiv Detail & Related papers (2025-06-15T09:41:25Z) - To Use or Not to Use a Universal Force Field [1.25431689228423]
Machine learning force fields (MLFFs) have emerged as powerful tools for molecular dynamics (MD) simulations. This Perspective evaluates the viability of universal MLFFs for simulating complex materials systems.
arXiv Detail & Related papers (2025-03-11T09:23:01Z) - Multi-frequency wavefield solutions for variable velocity models using meta-learning enhanced low-rank physics-informed neural network [3.069335774032178]
Physics-informed neural networks (PINNs) face significant challenges in modeling multi-frequency wavefields in complex velocity models. We propose Meta-LRPINN, a novel framework that combines low-rank parameterization with meta-learning and frequency embedding. Numerical experiments show that Meta-LRPINN achieves much faster convergence and much higher accuracy than baseline methods.
arXiv Detail & Related papers (2025-02-02T20:12:39Z) - Arbitrarily-Conditioned Multi-Functional Diffusion for Multi-Physics Emulation [17.67789938326378]
Arbitrarily-Conditioned Multi-Functional Diffusion (ACM-FD) is a versatile probabilistic surrogate model for multi-physics emulation. ACM-FD can perform a wide range of tasks within a single framework, including forward prediction, various inverse problems, and simulating data for entire systems or subsets of quantities conditioned on others. We propose a random-mask based, zero-regularized denoising loss to achieve flexible and robust conditional generation.
arXiv Detail & Related papers (2024-10-17T17:34:06Z) - Adaptive Frequency Filters As Efficient Global Token Mixers [100.27957692579892]
We show that adaptive frequency filters can serve as efficient global token mixers.
We take AFF token mixers as primary neural operators to build a lightweight neural network, dubbed AFFNet.
arXiv Detail & Related papers (2023-07-26T07:42:28Z) - Incremental Spatial and Spectral Learning of Neural Operators for Solving Large-Scale PDEs [86.35471039808023]
We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Our method demonstrates a 10% lower testing error, using 20% fewer frequency modes compared to the existing Fourier Neural Operator, while also achieving a 30% faster training.
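The incremental-mode idea can be illustrated with a toy spectral truncation. This is an assumed example, not the iFNO architecture itself: the real operator learns complex-valued weights on the retained modes, while here we only show the mechanics of growing a frequency budget.

```python
import numpy as np

def spectral_truncate(x, modes):
    # Keep only the lowest `modes` Fourier modes of a real 1-D signal.
    X = np.fft.rfft(x)
    X[modes:] = 0.0  # drop all frequencies above the current budget
    return np.fft.irfft(X, n=x.size)

# A signal with a low-frequency (3 cycles) and a high-frequency (40 cycles) part.
t = np.linspace(0, 1, 256, endpoint=False)
signal = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)

# Progressively enlarging the budget recovers more of the signal: the
# high-frequency component only survives once modes > 40.
for modes in (4, 16, 64):
    err = np.max(np.abs(spectral_truncate(signal, modes) - signal))
    print(modes, f"{err:.3f}")
```

The error stays near 0.3 (the amplitude of the dropped 40-cycle component) until the budget exceeds 40 modes, then collapses to numerical precision.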
arXiv Detail & Related papers (2022-11-28T09:57:15Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
arXiv Detail & Related papers (2022-11-26T01:56:05Z) - FAMLP: A Frequency-Aware MLP-Like Architecture For Domain Generalization [73.41395947275473]
We propose a novel frequency-aware architecture, in which the domain-specific features are filtered out in the transformed frequency domain.
Experiments on three benchmarks demonstrate significant performance, outperforming the state-of-the-art methods by a margin of 3%, 4% and 9%, respectively.
arXiv Detail & Related papers (2022-03-24T07:26:29Z) - Deep Frequency Filtering for Domain Generalization [55.66498461438285]
Deep Neural Networks (DNNs) have preferences for some frequency components in the learning process.
We propose Deep Frequency Filtering (DFF) for learning domain-generalizable features.
We show that applying our proposed DFF on a plain baseline outperforms the state-of-the-art methods on different domain generalization tasks.
arXiv Detail & Related papers (2022-03-23T05:19:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.