Massively parallel and universal approximation of nonlinear functions using diffractive processors
- URL: http://arxiv.org/abs/2507.08253v1
- Date: Fri, 11 Jul 2025 01:54:10 GMT
- Title: Massively parallel and universal approximation of nonlinear functions using diffractive processors
- Authors: Md Sadman Sakib Rahman, Yuhang Li, Xilin Yang, Shiqi Chen, Aydogan Ozcan
- Abstract summary: Large-scale nonlinear computation can be performed using linear optics through optimized diffractive processors composed of passive phase-only surfaces. We numerically demonstrate the parallel computation of one million distinct nonlinear functions, accurately executed at wavelength-scale spatial density. These results establish diffractive optical processors as a scalable platform for massively parallel universal nonlinear function approximation.
- Score: 17.16859564691328
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nonlinear computation is essential for a wide range of information processing tasks, yet implementing nonlinear functions using optical systems remains a challenge due to the weak and power-intensive nature of optical nonlinearities. Overcoming this limitation without relying on nonlinear optical materials could unlock unprecedented opportunities for ultrafast and parallel optical computing systems. Here, we demonstrate that large-scale nonlinear computation can be performed using linear optics through optimized diffractive processors composed of passive phase-only surfaces. In this framework, the input variables of nonlinear functions are encoded into the phase of an optical wavefront, e.g., via a spatial light modulator (SLM), and transformed by an optimized diffractive structure with spatially varying point-spread functions to yield output intensities that approximate a large set of unique nonlinear functions, all in parallel. We provide proof establishing that this architecture serves as a universal function approximator for an arbitrary set of bandlimited nonlinear functions, also covering multi-variate and complex-valued functions. We also numerically demonstrate the parallel computation of one million distinct nonlinear functions, accurately executed at wavelength-scale spatial density at the output of a diffractive optical processor. Furthermore, we experimentally validated this framework using in situ optical learning and approximated 35 unique nonlinear functions in a single shot using a compact setup consisting of an SLM and an image sensor. These results establish diffractive optical processors as a scalable platform for massively parallel universal nonlinear function approximation, paving the way for new capabilities in analog optical computing based on linear materials.
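The encode-propagate-detect pipeline described in the abstract can be illustrated with a toy numerical sketch. Assumptions: the optimized diffractive surfaces are replaced here by a single fixed random unitary mixing matrix (so the realized nonlinear functions are arbitrary rather than trained), a single scalar input is phase-encoded with random per-pixel coefficients, and all parameter values are illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64  # number of SLM pixels

# Stand-in for the optimized diffractive surfaces: a fixed random unitary
# mixing of the optical field.
U = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))[0]

# Per-pixel phase-encoding coefficients (an illustrative choice).
alpha = rng.uniform(0, 2 * np.pi, N)

def output_intensities(x):
    """Phase-encode a scalar input x, propagate through the linear optics,
    and read detector intensities. The field is linear in exp(i*alpha*x),
    but each detected intensity |.|^2 is a nonlinear (bandlimited
    trigonometric) function of x."""
    field_in = np.exp(1j * alpha * x)   # SLM phase encoding
    field_out = U @ field_in            # linear diffraction / mixing
    return np.abs(field_out) ** 2       # square-law intensity detection

# Each output pixel realizes a different nonlinear function of x, in parallel.
xs = np.linspace(0.0, 1.0, 5)
table = np.array([output_intensities(x) for x in xs])  # shape (5, N)
```

Because U is unitary and every input pixel has unit amplitude, the total detected power is conserved for all x; the nonlinearity lives entirely in the phase encoding and the square-law detection, consistent with the paper's point that no nonlinear optical material is required.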
Related papers
- Efficient Training for Optical Computing [0.0]
We introduce a novel backpropagation algorithm that incorporates plane wave decomposition via the Fourier transform. We demonstrate a significant reduction in training time by exploiting the structured and sparse nature of diffractive systems in training and inference.
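The plane-wave (angular-spectrum) decomposition of free-space propagation that such Fourier-based training builds on can be sketched as follows; the grid size, wavelength, and pixel pitch are illustrative choices, not parameters from the paper.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a square 2-D complex field over distance z using the
    angular spectrum method: decompose into plane waves with an FFT,
    apply each wave's propagation phase, and transform back."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)              # sampled spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)       # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: propagate a Gaussian beam by 1 mm (illustrative parameters).
n, dx, wl = 128, 5e-6, 633e-9
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
beam = np.exp(-(X**2 + Y**2) / (50e-6) ** 2).astype(complex)
out = angular_spectrum_propagate(beam, wl, dx, 1e-3)
```

With this pixel pitch every sampled frequency lies inside the propagating band, so the transfer function is pure phase and the beam power is conserved, which makes energy conservation a convenient sanity check.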
arXiv Detail & Related papers (2025-06-25T21:03:47Z)
- Unwrapping photonic reservoirs: enhanced expressivity via random Fourier encoding over stretched domains [0.0]
Photonic Reservoir Computing (RC) systems leverage the complex propagation and nonlinear interaction of optical waves to perform information processing tasks. We propose a novel scattering-assisted photonic reservoir encoding scheme where the input phase is deliberately wrapped multiple times. We demonstrate that, rather than hindering nonlinear separability through loss of bijectivity, wrapping significantly improves the reservoir's prediction performance.
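A minimal numerical caricature of the stretched-phase idea: stretching the input before a periodic (wrapped) encoding raises the effective frequency content of random features, which lets a linear readout fit a sharper nonlinear target. The cosine readout, frequency ranges, and target function are illustrative assumptions, not the paper's actual reservoir.

```python
import numpy as np

rng = np.random.default_rng(0)
u = np.linspace(-1.0, 1.0, 200)
target = np.sign(np.sin(4 * np.pi * u))  # a sharp nonlinear target

def wrapped_features(u, stretch, n_feat=64):
    """Random-Fourier-style features cos(stretch * w * u + b): a large
    stretch winds the encoded phase through many 2*pi periods."""
    w = rng.uniform(0.5, 1.5, n_feat)
    b = rng.uniform(0.0, 2 * np.pi, n_feat)
    return np.cos(stretch * np.outer(u, w) + b)

def readout_error(stretch):
    """Residual of a least-squares linear readout on the target."""
    F = wrapped_features(u, stretch)
    coef, *_ = np.linalg.lstsq(F, target, rcond=None)
    return float(np.linalg.norm(F @ coef - target))

err_unwrapped = readout_error(1.0)   # phase barely wraps
err_wrapped = readout_error(10.0)    # phase wraps many times
```

With stretch = 1 the features vary slowly over the domain and cannot follow the square wave; with stretch = 10 the feature frequencies reach the target's fundamental and the readout residual drops.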
arXiv Detail & Related papers (2025-06-02T08:07:00Z)
- Unconventional Computing based on Four Wave Mixing in Highly Nonlinear Waveguides [0.0]
We numerically analyze a photonic unconventional accelerator based on the four-wave mixing effect in highly nonlinear waveguides.
By exploiting the rich Kerr-induced nonlinearities, multiple nonlinear transformations of an input signal can be generated and used for solving complex nonlinear tasks.
arXiv Detail & Related papers (2024-02-14T12:34:38Z)
- Transformers Implement Functional Gradient Descent to Learn Non-Linear Functions In Context [44.949726166566236]
We show that (non-linear) Transformers naturally learn to implement gradient descent in function space.
We also show that the optimal choice of non-linear activation depends in a natural way on the class of functions that need to be learned.
arXiv Detail & Related papers (2023-12-11T17:05:25Z)
- Complex-valued universal linear transformations and image encryption using spatially incoherent diffractive networks [0.0]
As an optical processor, a Diffractive Deep Neural Network (D2NN) utilizes engineered diffractive surfaces designed through machine learning to perform all-optical information processing.
We show that a spatially incoherent diffractive visual processor can approximate any complex-valued linear transformation and be used for all-optical image encryption using incoherent illumination.
arXiv Detail & Related papers (2023-10-05T08:43:59Z)
- Pessimistic Nonlinear Least-Squares Value Iteration for Offline Reinforcement Learning [53.97335841137496]
We propose an oracle-efficient algorithm, dubbed Pessimistic Nonlinear Least-Squares Value Iteration (PNLSVI), for offline RL with non-linear function approximation.
Our algorithm enjoys a regret bound that has a tight dependency on the function class complexity and achieves minimax optimal instance-dependent regret when specialized to linear function approximation.
arXiv Detail & Related papers (2023-10-02T17:42:01Z)
- Nonlinear optical encoding enabled by recurrent linear scattering [16.952531256252744]
We introduce a design that passively induces an optical nonlinear random mapping with a continuous-wave laser at low power. We demonstrate that our design retains vital information even when the readout dimensionality is reduced. This capability allows our optical platforms to offer efficient optical information processing solutions across applications.
arXiv Detail & Related papers (2023-07-17T15:15:47Z)
- Exploring Linear Feature Disentanglement For Neural Networks [63.20827189693117]
Non-linear activation functions, e.g., Sigmoid, ReLU, and Tanh, have achieved great success in neural networks (NNs).
Due to the complex non-linear characteristic of samples, the objective of those activation functions is to project samples from their original feature space to a linear separable feature space.
This phenomenon ignites our interest in exploring whether all features need to be transformed by all non-linear functions in current typical NNs.
arXiv Detail & Related papers (2022-03-22T13:09:17Z)
- Designing Kerr Interactions for Quantum Information Processing via Counterrotating Terms of Asymmetric Josephson-Junction Loops [68.8204255655161]
Static cavity nonlinearities typically limit the performance of bosonic quantum error-correcting codes.
Treating the nonlinearity as a perturbation, we derive effective Hamiltonians using the Schrieffer-Wolff transformation.
Results show that a cubic interaction allows increasing the effective rates of both linear and nonlinear operations.
arXiv Detail & Related papers (2021-07-14T15:11:05Z)
- Linear embedding of nonlinear dynamical systems and prospects for efficient quantum algorithms [74.17312533172291]
We describe a method for mapping any finite nonlinear dynamical system to an infinite linear dynamical system (embedding).
We then explore an approach for approximating the resulting infinite linear system with finite linear systems (truncation).
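The embed-then-truncate idea in this summary can be made concrete with a classic Carleman-style example; the particular ODE, its monomial embedding, and the truncation order below are all chosen for illustration.

```python
import numpy as np

# Nonlinear scalar ODE: dx/dt = -x^2, exact solution x(t) = x0 / (1 + x0*t).
# Carleman embedding: the monomials y_k = x^k obey the *linear* infinite
# system dy_k/dt = -k * y_{k+1}; truncate at order K by setting y_{K+1} = 0.
K = 12
A = np.zeros((K, K))
for k in range(1, K):
    A[k - 1, k] = -k           # row for y_k couples only to y_{k+1}
# (the last row stays zero: that is the truncation)

x0, t = 0.5, 1.0
y = x0 ** np.arange(1, K + 1)  # initial monomials (y_1, ..., y_K)

# e^{tA} via its Taylor series; A is strictly upper triangular (nilpotent),
# so the series terminates exactly after K terms.
term = y.copy()
yt = y.copy()
for n in range(1, K + 1):
    term = (t / n) * (A @ term)
    yt = yt + term

x_truncated = yt[0]            # y_1(t) approximates x(t)
x_exact = x0 / (1 + x0 * t)
```

For this example the truncated solution is the geometric series for x0/(1 + x0*t) cut at K terms, so the error shrinks like (x0*t)^K when |x0*t| < 1; truncation quality degrades as the dynamics become more strongly nonlinear.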
arXiv Detail & Related papers (2020-12-12T00:01:10Z)
- Sparse Quantized Spectral Clustering [85.77233010209368]
We exploit tools from random matrix theory to make precise statements about how the eigenspectrum of a matrix changes under such nonlinear transformations.
We show that very little change occurs in the informative eigenstructure even under drastic sparsification/quantization.
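A quick numerical check of that claim under toy assumptions (two Gaussian clusters and a 1-bit sign quantization of the Gram matrix; this is not the paper's exact setting or analysis): the leading eigenvector survives drastic quantization.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 200, 50

# Two-cluster data: first half shifted up, second half shifted down.
X = rng.normal(size=(n, d))
X[:100] += 1.0
X[100:] -= 1.0

G = X @ X.T / d          # Gram (similarity) matrix
Gq = np.sign(G)          # drastic 1-bit quantization

# Leading eigenvectors before and after quantization.
v = np.linalg.eigh(G)[1][:, -1]
vq = np.linalg.eigh(Gq)[1][:, -1]

overlap = abs(v @ vq)    # near 1 means the informative direction survived
```

Here the quantized matrix is close to an exact two-block sign pattern, so its top eigenvector still separates the clusters almost as well as the unquantized one.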
arXiv Detail & Related papers (2020-10-03T15:58:07Z)
- Rapid characterisation of linear-optical networks via PhaseLift [51.03305009278831]
Integrated photonics offers great phase stability and can rely on the large-scale manufacturability provided by the semiconductor industry.
New devices, based on such optical circuits, hold the promise of faster and energy-efficient computations in machine learning applications.
We present a novel technique to reconstruct the transfer matrix of linear optical networks.
arXiv Detail & Related papers (2020-10-01T16:04:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.