High-Fidelity Prediction of Perturbed Optical Fields using Fourier Feature Networks
- URL: http://arxiv.org/abs/2508.19751v2
- Date: Tue, 02 Sep 2025 09:02:37 GMT
- Title: High-Fidelity Prediction of Perturbed Optical Fields using Fourier Feature Networks
- Authors: Joshua R. Jandrell, Mitchell A. Cox
- Abstract summary: We present a novel data-efficient machine learning framework that learns the perturbation-dependent transmission matrix of a multimode fibre. On experimental data from a compressed fibre, our model predicts the output field with a 0.995 complex correlation to the ground truth. This approach provides a general tool for modelling complex optical systems from sparse measurements.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Predicting the effects of physical perturbations on optical channels is critical for advanced photonic devices, but existing modelling techniques are often computationally intensive or require exhaustive characterisation. We present a novel data-efficient machine learning framework that learns the perturbation-dependent transmission matrix of a multimode fibre. To overcome the challenge of modelling the resulting highly oscillatory functions, we encode the perturbation into a Fourier Feature basis, enabling a compact multi-layer perceptron to learn the mapping with high fidelity. On experimental data from a compressed fibre, our model predicts the output field with a 0.995 complex correlation to the ground truth, improving accuracy by an order of magnitude over standard networks while using 85% fewer parameters. This approach provides a general tool for modelling complex optical systems from sparse measurements.
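The encoding step described in the abstract, mapping a perturbation parameter into a Fourier feature basis before a compact MLP, can be sketched as below. This is a generic sketch rather than the authors' implementation: the frequency matrix `B`, its Gaussian scale, and the one-dimensional perturbation parameter are illustrative assumptions.

```python
import numpy as np

def fourier_features(p, B):
    """Map perturbation parameters p (shape [n, d]) to the basis
    gamma(p) = [cos(2*pi*p @ B.T), sin(2*pi*p @ B.T)], shape [n, 2m]."""
    proj = 2.0 * np.pi * p @ B.T
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
B = rng.normal(0.0, 10.0, size=(64, 1))        # 64 random frequencies; the
                                               # scale 10.0 is a tuning choice
p = np.linspace(0.0, 1.0, 100).reshape(-1, 1)  # e.g. compression strength
phi = fourier_features(p, B)                   # input to the compact MLP
print(phi.shape)  # (100, 128)
```

The encoded vector `phi`, rather than the raw parameter `p`, is what the downstream network consumes; this is what lets a small MLP represent the highly oscillatory perturbation dependence.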
Related papers
- F-Adapter: Frequency-Adaptive Parameter-Efficient Fine-Tuning in Scientific Machine Learning [28.598598268071587]
We conduct the first systematic study of PEFT for pre-trained Large Operator Models (LOMs). We observe that the widely used Low-Rank Adaptation (LoRA) yields markedly poorer performance on LOMs than Adapter tuning. Motivated by the robust empirical gains of adapters and by our theoretical characterization of PDE solutions as spectrally sparse, we introduce Frequency-Adaptive Adapter (F-Adapter).
arXiv Detail & Related papers (2025-09-27T08:05:28Z) - Near-optimal decomposition of unitary matrices using phase masks and the discrete Fourier transform [0.0]
We introduce a constructive decomposition of unitary matrices using a sequence of $2N+5$ phase masks interleaved with $2N+4$ discrete Fourier transform matrices. This decomposition can be leveraged to design universal interferometers based on phase masks and multimode interference couplers.
arXiv Detail & Related papers (2025-08-27T16:13:34Z) - Noise-tolerant tomography of multimode linear optical interferometers with single photons [0.0]
We present a method for reconstructing the transfer matrix of a linear optical interferometer. Our approach accounts for losses and photon indistinguishability, making it robust to experimental imperfections. The results show high fidelity in matrix reconstruction and successful application in boson sampling experiments.
arXiv Detail & Related papers (2025-06-25T14:36:38Z) - LSCD: Lomb-Scargle Conditioned Diffusion for Time series Imputation [55.800319453296886]
Time series with missing or irregularly sampled data are a persistent challenge in machine learning. We introduce a differentiable Lomb-Scargle layer that enables reliable computation of the power spectrum of irregularly sampled data.
arXiv Detail & Related papers (2025-06-20T14:48:42Z) - Robustifying Fourier Features Embeddings for Implicit Neural Representations [25.725097757343367]
Implicit Neural Representations (INRs) employ neural networks to represent continuous functions by mapping coordinates to the corresponding values of the target function. INRs face a challenge known as spectral bias when dealing with scenes containing varying frequencies.
arXiv Detail & Related papers (2025-02-08T07:43:37Z) - Coordinate-based Neural Network for Fourier Phase Retrieval [8.827173113748703]
Single impliCit neurAl Network (SCAN) is a tool built upon coordinate neural networks meticulously designed for enhanced phase retrieval performance.
SCAN adeptly connects object coordinates to their amplitude and phase within a unified network in an unsupervised manner.
arXiv Detail & Related papers (2023-11-25T04:23:23Z) - ESSAformer: Efficient Transformer for Hyperspectral Image Super-resolution [76.7408734079706]
Single hyperspectral image super-resolution (single-HSI-SR) aims to restore a high-resolution hyperspectral image from a low-resolution observation.
We propose ESSAformer, an ESSA attention-embedded Transformer network for single-HSI-SR with an iterative refining structure.
arXiv Detail & Related papers (2023-07-26T07:45:14Z) - Physics-Driven Turbulence Image Restoration with Stochastic Refinement [80.79900297089176]
Image distortion by atmospheric turbulence is a critical problem in long-range optical imaging systems.
Fast and physics-grounded simulation tools have been introduced to help the deep-learning models adapt to real-world turbulence conditions.
This paper proposes the Physics-integrated Restoration Network (PiRN) to help the network disentangle the stochasticity from the degradation and the underlying image.
arXiv Detail & Related papers (2023-07-20T05:49:21Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Understanding the Spectral Bias of Coordinate Based MLPs Via Training Dynamics [2.9443230571766854]
We study the connection between the computations of ReLU networks and the speed of gradient descent convergence.
We then use this formulation to study the severity of spectral bias in low dimensional settings, and how positional encoding overcomes this.
arXiv Detail & Related papers (2023-01-14T04:21:25Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, thanks to the expressive self-attention mechanism.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - Incremental Spatial and Spectral Learning of Neural Operators for Solving Large-Scale PDEs [86.35471039808023]
We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Our method demonstrates a 10% lower testing error, using 20% fewer frequency modes compared to the existing Fourier Neural Operator, while also achieving a 30% faster training.
arXiv Detail & Related papers (2022-11-28T09:57:15Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
arXiv Detail & Related papers (2022-11-26T01:56:05Z) - Data-driven Modeling of Mach-Zehnder Interferometer-based Optical Matrix Multipliers [0.0]
Photonic integrated circuits are facilitating the development of optical neural networks.
We describe both simple analytical models and data-driven models for offline training of optical matrix multipliers.
The neural network-based models outperform the simple physics-based models in terms of prediction error.
arXiv Detail & Related papers (2022-10-17T15:19:26Z) - Low-power multi-mode fiber projector overcomes shallow neural networks classifiers [10.161703420607552]
Multi-mode optical fibers stand out as cost-effective and easy-to-handle tools. We cast these fibers into random hardware projectors, transforming an input dataset into a higher dimensional speckled image set. We found that the classification accuracy achieved is higher than that obtained with the standard transmission matrix model.
arXiv Detail & Related papers (2022-10-10T14:55:02Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z) - Large-Scale Learning with Fourier Features and Tensor Decompositions [3.6930948691311007]
We exploit the tensor product structure of deterministic Fourier features, which enables us to represent the model parameters as a low-rank tensor decomposition.
We demonstrate by means of numerical experiments how our low-rank tensor approach obtains the same performance as the corresponding nonparametric model.
arXiv Detail & Related papers (2021-09-03T14:12:53Z) - Regularization by Denoising Sub-sampled Newton Method for Spectral CT Multi-Material Decomposition [78.37855832568569]
We propose to solve a model-based maximum a posteriori problem to reconstruct multi-material images with application to spectral CT.
In particular, we propose to solve a regularized optimization problem based on a plug-in image-denoising function.
We show numerical and experimental results for spectral CT materials decomposition.
arXiv Detail & Related papers (2021-03-25T15:20:10Z) - Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains [69.62456877209304]
We show that passing input points through a simple Fourier feature mapping enables a multilayer perceptron to learn high-frequency functions.
Results shed light on advances in computer vision and graphics that achieve state-of-the-art results.
arXiv Detail & Related papers (2020-06-18T17:59:11Z)
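To illustrate why the Fourier feature mapping recurring in the papers above helps, the sketch below fits a high-frequency signal by linear least squares, once on raw coordinates and once on Fourier features. It uses deterministic integer frequencies purely so the result is reproducible (a simplification: several of these papers draw frequencies at random); all names are illustrative.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2.0 * np.pi * 10.0 * x[:, 0])   # high-frequency target

# Fourier feature basis with deterministic integer frequencies 1..32,
# chosen so the target frequency (10) lies exactly in the basis.
b = np.arange(1, 33).reshape(-1, 1).astype(float)
proj = 2.0 * np.pi * x @ b.T
phi = np.concatenate([np.cos(proj), np.sin(proj)], axis=1)

# Linear least squares stands in for a network's final linear layer.
raw = np.concatenate([x, np.ones_like(x)], axis=1)
err_raw = np.mean((raw @ np.linalg.lstsq(raw, y, rcond=None)[0] - y) ** 2)
err_ff = np.mean((phi @ np.linalg.lstsq(phi, y, rcond=None)[0] - y) ** 2)
print(err_raw, err_ff)  # raw coordinates fail; Fourier features fit it
```

The raw-coordinate fit is essentially a flat line through an oscillation, while the Fourier-feature fit recovers the signal: the same spectral-bias effect these works address in trained networks.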
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.