Fourier Learning Machines: Nonharmonic Fourier-Based Neural Networks for Scientific Machine Learning
- URL: http://arxiv.org/abs/2509.08759v1
- Date: Wed, 10 Sep 2025 16:49:20 GMT
- Title: Fourier Learning Machines: Nonharmonic Fourier-Based Neural Networks for Scientific Machine Learning
- Authors: Mominul Rubel, Adam Meyers, Gabriel Nicolosi
- Abstract summary: We introduce the Fourier Learning Machine (FLM), a neural network (NN) architecture designed to represent a multidimensional nonharmonic Fourier series. The FLM uses a simple feedforward structure with cosine activation functions to learn the frequencies, amplitudes, and phase shifts of the series as trainable parameters. Computational experiments show that the performance of FLMs is comparable, and often superior, to that of established architectures like SIREN and vanilla feedforward NNs.
- Score: 0.5097809301149341
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce the Fourier Learning Machine (FLM), a neural network (NN) architecture designed to represent a multidimensional nonharmonic Fourier series. The FLM uses a simple feedforward structure with cosine activation functions to learn the frequencies, amplitudes, and phase shifts of the series as trainable parameters. This design allows the model to create a problem-specific spectral basis adaptable to both periodic and nonperiodic functions. Unlike previous Fourier-inspired NN models, the FLM is the first architecture able to represent a complete, separable Fourier basis in multiple dimensions using a standard Multilayer Perceptron-like architecture. A one-to-one correspondence between the Fourier coefficients and the learned amplitudes and phase shifts is demonstrated, allowing translation between the full, separable basis form and the cosine phase-shifted form. Additionally, we evaluate the performance of FLMs on several scientific computing problems, including benchmark Partial Differential Equations (PDEs) and a family of Optimal Control Problems (OCPs). Computational experiments show that the performance of FLMs is comparable, and often superior, to that of established architectures like SIREN and vanilla feedforward NNs.
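For reference, the translation described above reduces, term by term, to the standard angle-addition identity (the notation here is illustrative, not taken from the paper):

\[
a\cos(\omega x + \varphi) = \underbrace{a\cos\varphi}_{\alpha}\cos(\omega x) - \underbrace{a\sin\varphi}_{\beta}\sin(\omega x),
\qquad
a = \sqrt{\alpha^{2} + \beta^{2}}, \quad \varphi = \operatorname{atan2}(-\beta, \alpha),
\]

so each learned phase-shifted cosine term corresponds one-to-one to a cosine/sine coefficient pair of a nonharmonic series, and in several dimensions the same identity applied factor by factor to the separable products gives the stated equivalence. A self-contained code sketch of the cosine-activation construction appears after the related-papers list below.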
Related papers
- FUTON: Fourier Tensor Network for Implicit Neural Representations [56.48739018255443]
Implicit neural representations (INRs) have emerged as powerful tools for encoding signals, yet dominant designs often suffer from slow convergence, overfitting to noise, and poor extrapolation. We introduce FUTON, which models signals as generalized Fourier series whose coefficients are parameterized by a low-rank tensor decomposition.
arXiv Detail & Related papers (2026-02-13T19:31:44Z) - Fourier Fingerprints of Ansatzes in Quantum Machine Learning [0.0]
We show the existence of correlations between the Fourier modes, which depend on the structure of the circuit. For several popular ansatzes, we compute the Fourier coefficient correlations (FCCs) and construct the Fourier fingerprint. We show how, for the problem of learning random Fourier series, the FCC correctly predicts the relative performance of ansatzes whilst the widely-used expressibility metric does not.
arXiv Detail & Related papers (2025-08-28T15:00:37Z) - Fourier Basis Mapping: A Time-Frequency Learning Framework for Time Series Forecasting [25.304812011127257]
We introduce a novel method for integrating time-frequency features through Fourier basis expansion and mapping in the time-frequency space. Our approach extracts explicit frequency features while preserving temporal characteristics. The results are validated on diverse real-world datasets for both long-term and short-term forecasting tasks.
arXiv Detail & Related papers (2025-07-13T01:45:27Z) - Neural Fourier Modelling: A Highly Compact Approach to Time-Series Analysis [9.969451740838418]
We introduce Neural Fourier Modelling (NFM), a compact yet powerful solution for time-series analysis.
NFM is grounded in two key properties of the Fourier transform (FT): (i) the ability to model finite-length time series as functions in the Fourier domain, and (ii) the capacity for data manipulation within the Fourier domain.
NFM achieves state-of-the-art performance on a wide range of tasks, including challenging time-series scenarios with previously unseen sampling rates at test time.
arXiv Detail & Related papers (2024-10-07T02:39:55Z) - Fourier Series Guided Design of Quantum Convolutional Neural Networks for Enhanced Time Series Forecasting [0.9060149007362646]
We apply 1D quantum convolution to address the task of time series forecasting. By encoding multiple points into the quantum circuit to predict subsequent data, each point becomes a feature, transforming the problem into a multidimensional one.
arXiv Detail & Related papers (2024-04-23T01:35:07Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
arXiv Detail & Related papers (2022-11-26T01:56:05Z) - Deep Fourier Up-Sampling [100.59885545206744]
Up-sampling in the Fourier domain is more challenging as it does not follow such a local property.
We propose a theoretically sound Deep Fourier Up-Sampling (FourierUp) to solve these issues.
arXiv Detail & Related papers (2022-10-11T06:17:31Z) - Functional Regularization for Reinforcement Learning via Learned Fourier Features [98.90474131452588]
We propose a simple architecture for deep reinforcement learning by embedding inputs into a learned Fourier basis.
We show that it improves the sample efficiency of both state-based and image-based RL.
arXiv Detail & Related papers (2021-12-06T18:59:52Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z) - Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases [73.53227696624306]
We present a new family of algorithms for learning Fourier-sparse set functions.
In contrast to other work that focused on the Walsh-Hadamard transform, our novel algorithms operate with recently introduced non-orthogonal Fourier transforms.
We demonstrate effectiveness on several real-world applications.
arXiv Detail & Related papers (2020-10-01T14:31:59Z) - Fourier Neural Networks as Function Approximators and Differential Equation Solvers [0.456877715768796]
The choice of activation and loss function yields results that replicate a Fourier series expansion closely.
We validate this FNN on naturally periodic smooth functions and on piecewise continuous periodic functions.
The main advantages of the current approach are the validity of the solution outside the training region, interpretability of the trained model, and simplicity of use.
arXiv Detail & Related papers (2020-05-27T00:30:58Z)
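The cosine-activation construction described in the FLM abstract above, and echoed in this last entry, can be illustrated with a minimal, self-contained sketch. This assumes PyTorch; the class and variable names (CosineExpansion, freq, amp) are illustrative and are not taken from either paper.

```python
import torch
import torch.nn as nn


class CosineExpansion(nn.Module):
    """Sketch of a phase-shifted cosine expansion: f(x) ~ a0 + sum_k a_k * cos(w_k . x + phi_k).

    Frequencies w_k, phase shifts phi_k, and amplitudes a_k are all ordinary
    trainable parameters of two affine maps with a cosine in between.
    """

    def __init__(self, in_dim: int, n_terms: int):
        super().__init__()
        self.freq = nn.Linear(in_dim, n_terms)  # weight rows = frequencies w_k, bias = phase shifts phi_k
        self.amp = nn.Linear(n_terms, 1)        # weights = amplitudes a_k, bias = constant term a0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.amp(torch.cos(self.freq(x)))  # cosine activation between the two affine maps


if __name__ == "__main__":
    torch.manual_seed(0)
    model = CosineExpansion(in_dim=1, n_terms=16)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)

    # Fit a toy target whose frequencies are not integer multiples of a common base frequency.
    x = torch.linspace(0.0, 2.0, 256).unsqueeze(-1)
    y = torch.sin(3.7 * x) + 0.5 * torch.cos(9.1 * x)
    for _ in range(3000):
        opt.zero_grad()
        loss = torch.mean((model(x) - y) ** 2)
        loss.backward()
        opt.step()

    # Interpretability: the learned spectrum can be read straight from the weights.
    with torch.no_grad():
        for w, phi, a in zip(model.freq.weight[:, 0], model.freq.bias, model.amp.weight[0]):
            print(f"{a.item():+.3f} * cos({w.item():.3f} * x + {phi.item():+.3f})")
```

Trained this way, each hidden unit plays the role of one series term, which is why models of this form remain inspectable after training.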