PhotoFourier: A Photonic Joint Transform Correlator-Based Neural Network Accelerator
- URL: http://arxiv.org/abs/2211.05276v1
- Date: Thu, 10 Nov 2022 00:48:36 GMT
- Title: PhotoFourier: A Photonic Joint Transform Correlator-Based Neural Network Accelerator
- Authors: Shurui Li, Hangbo Yang, Chee Wei Wong, Volker J. Sorger, Puneet Gupta
- Abstract summary: Integrated photonics has the potential to dramatically accelerate neural networks because of its low-latency nature.
The PhotoFourier accelerator achieves more than 28X better energy-delay product compared to state-of-the-art photonic neural network accelerators.
- Score: 2.1372541869293555
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The last few years have seen a lot of work to address the challenge of
low-latency and high-throughput convolutional neural network inference.
Integrated photonics has the potential to dramatically accelerate neural
networks because of its low-latency nature. Combined with the concept of Joint
Transform Correlator (JTC), the computationally expensive convolution functions
can be computed instantaneously (time of flight of light) with almost no cost.
This 'free' convolution computation provides the theoretical basis of the
proposed PhotoFourier JTC-based CNN accelerator. PhotoFourier addresses a
myriad of challenges posed by on-chip photonic computing in the Fourier domain
including 1D lenses and high-cost optoelectronic conversions. The proposed
PhotoFourier accelerator achieves more than 28X better energy-delay product
compared to state-of-the-art photonic neural network accelerators.
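The 'free' convolution claim rests on the convolution theorem: a lens performs a Fourier transform optically, which reduces convolution to an elementwise product of spectra. A minimal NumPy sketch of that identity (purely numerical, not the photonic implementation; signal sizes are illustrative):

```python
import numpy as np

# Convolution theorem underlying JTC-style 'free' convolution:
# multiplying spectra in the Fourier domain equals circular convolution
# in the spatial domain. In a JTC, the Fourier transform itself is
# performed by a lens at the speed of light.
rng = np.random.default_rng(0)
n = 8
x = rng.standard_normal(n)   # toy input signal (e.g. a feature-map row)
k = rng.standard_normal(n)   # toy kernel, same length (zero-padded)

# Fourier-domain product, then inverse transform
fourier_conv = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))

# Direct circular convolution for comparison
direct = np.array(
    [sum(x[m] * k[(i - m) % n] for m in range(n)) for i in range(n)]
)

assert np.allclose(fourier_conv, direct)
```

The same identity extends to 2D feature maps with `np.fft.fft2`, which is closer to what a lens-based correlator computes.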
Related papers
- Training Hybrid Neural Networks with Multimode Optical Nonlinearities Using Digital Twins [2.8479179029634984]
We exploit ultrashort pulse propagation in multimode fibers, which performs large-scale nonlinear transformations.
Training the hybrid architecture is achieved through a neural model that differentiably approximates the optical system.
Our experimental results achieve state-of-the-art image classification accuracies and simulation fidelity.
arXiv Detail & Related papers (2025-01-14T10:35:18Z)
- Optical training of large-scale Transformers and deep neural networks with direct feedback alignment [48.90869997343841]
We experimentally implement a versatile and scalable training algorithm, called direct feedback alignment, on a hybrid electronic-photonic platform.
An optical processing unit performs large-scale random matrix multiplications, which is the central operation of this algorithm, at speeds up to 1500 TeraOps.
We study the compute scaling of our hybrid optical approach, and demonstrate a potential advantage for ultra-deep and wide neural networks.
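Direct feedback alignment replaces the backpropagated error at each hidden layer with a fixed random projection of the output error; that projection is the large matrix multiplication done optically in this work. A minimal single-sample sketch in NumPy (sizes, learning rates, and seed are illustrative assumptions, not the paper's setup):

```python
import numpy as np

# Toy direct feedback alignment (DFA) on a two-layer network.
# The fixed random projection B @ e is the operation the hybrid
# platform performs optically; here it is plain NumPy.
rng = np.random.default_rng(1)
n_in, n_hid, n_out = 4, 16, 3
lr_out, lr_hid = 0.05, 0.005

W1 = rng.standard_normal((n_hid, n_in)) * 0.1
W2 = rng.standard_normal((n_out, n_hid)) * 0.1
B = rng.standard_normal((n_hid, n_out)) * 0.1  # fixed random feedback matrix

x = rng.standard_normal(n_in)  # toy input
y = np.eye(n_out)[0]           # toy one-hot target

loss0 = float(np.sum((W2 @ np.tanh(W1 @ x) - y) ** 2))
for _ in range(300):
    h = np.tanh(W1 @ x)              # forward pass
    e = W2 @ h - y                   # output error
    dh = (B @ e) * (1.0 - h ** 2)    # DFA: random projection, not W2.T @ e
    W2 -= lr_out * np.outer(e, h)
    W1 -= lr_hid * np.outer(dh, x)
loss = float(np.sum((W2 @ np.tanh(W1 @ x) - y) ** 2))
```

Because `B` is fixed and random, the backward pass needs no transpose of the forward weights, which is what makes the rule amenable to a fixed optical random projection.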
arXiv Detail & Related papers (2024-09-01T12:48:47Z)
- Hybrid Quantum-Classical Photonic Neural Networks [0.0]
We show a combination of classical network layers with trainable continuous variable quantum circuits.
On a classification task, hybrid networks achieve the same performance when benchmarked against fully classical networks that are twice the size.
arXiv Detail & Related papers (2024-07-02T15:31:38Z)
- Free-Space Optical Spiking Neural Network [0.0]
We introduce the Free-space Optical deep Spiking Convolutional Neural Network (OSCNN).
This novel approach draws inspiration from computational models of the human eye.
Our results demonstrate promising performance with minimal latency and power consumption compared to electronic counterparts.
arXiv Detail & Related papers (2023-11-08T09:41:14Z)
- Spatially Varying Nanophotonic Neural Networks [39.1303097259564]
Photonic processors that execute operations using photons instead of electrons promise to enable optical neural networks with ultra-low latency and power consumption.
Existing optical neural networks, limited by the underlying network designs, have achieved image recognition accuracy far below that of state-of-the-art electronic neural networks.
arXiv Detail & Related papers (2023-08-07T08:48:46Z)
- All-optical graph representation learning using integrated diffractive photonic computing units [51.15389025760809]
Photonic neural networks perform brain-inspired computations using photons instead of electrons.
We propose an all-optical graph representation learning architecture, termed diffractive graph neural network (DGNN).
We demonstrate the use of DGNN extracted features for node and graph-level classification tasks with benchmark databases and achieve superior performance.
arXiv Detail & Related papers (2022-04-23T02:29:48Z)
- A new concept for design of photonic integrated circuits with the ultimate density and low loss [62.997667081978825]
We propose a new concept for design of PICs with the ultimate downscaling capability, the absence of geometric loss and a high-fidelity throughput.
This is achieved by a periodic continuous-time quantum walk of photons through waveguide arrays.
We demonstrate the potential of the new concept by reconsidering the design of basic building blocks of the information and sensing systems.
arXiv Detail & Related papers (2021-08-02T14:23:18Z)
- Photonic neural field on a silicon chip: large-scale, high-speed neuro-inspired computing and sensing [0.0]
Photonic neural networks have significant potential for high-speed neural processing with low latency and ultralow energy consumption.
We propose the concept of a photonic neural field and implement it experimentally on a silicon chip to realize highly scalable neuro-inspired computing.
In this study, we use the on-chip photonic neural field as a reservoir of information and demonstrate a high-speed chaotic time-series prediction with low errors.
arXiv Detail & Related papers (2021-05-22T09:28:51Z)
- Interleaving: Modular architectures for fault-tolerant photonic quantum computing [50.591267188664666]
Photonic fusion-based quantum computing (FBQC) uses low-loss photonic delays.
We present a modular architecture for FBQC in which these components are combined to form "interleaving modules".
Exploiting the multiplicative power of delays, each module can add thousands of physical qubits to the computational Hilbert space.
arXiv Detail & Related papers (2021-03-15T18:00:06Z)
- Rapid characterisation of linear-optical networks via PhaseLift [51.03305009278831]
Integrated photonics offers great phase-stability and can rely on the large scale manufacturability provided by the semiconductor industry.
New devices, based on such optical circuits, hold the promise of faster and energy-efficient computations in machine learning applications.
We present a novel technique to reconstruct the transfer matrix of linear optical networks.
arXiv Detail & Related papers (2020-10-01T16:04:22Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
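The MP-style neuron that the FT model departs from is simply an activation function applied to a real-valued weighted sum of inputs. A minimal sketch (names and values are illustrative):

```python
import numpy as np

# Classic MP-style neuron: activation applied to the real-valued
# weighted aggregation of incoming signals plus a bias term.
def mp_neuron(inputs, weights, bias, activation=np.tanh):
    return activation(np.dot(weights, inputs) + bias)

out = mp_neuron(
    np.array([1.0, -0.5, 0.25]),   # incoming signals
    np.array([0.4, 0.3, -0.2]),    # synaptic weights
    0.1,                           # bias
)
```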
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.