Spatially Varying Nanophotonic Neural Networks
- URL: http://arxiv.org/abs/2308.03407v3
- Date: Sat, 30 Dec 2023 21:44:15 GMT
- Title: Spatially Varying Nanophotonic Neural Networks
- Authors: Kaixuan Wei, Xiao Li, Johannes Froech, Praneeth Chakravarthula, James
Whitehead, Ethan Tseng, Arka Majumdar, Felix Heide
- Abstract summary: Photonic processors that execute operations using photons instead of electrons promise to enable optical neural networks with ultra-low latency and power consumption.
Existing optical neural networks, limited by the underlying network designs, have achieved image recognition accuracy far below that of state-of-the-art electronic neural networks.
- Score: 39.1303097259564
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The explosive growth of computation and energy cost of artificial
intelligence has spurred strong interest in new computing modalities as
potential alternatives to conventional electronic processors. Photonic
processors that execute operations using photons instead of electrons
promise to enable optical neural networks with ultra-low latency and power
consumption. However, existing optical neural networks, limited by the
underlying network designs, have achieved image recognition accuracy far below
that of state-of-the-art electronic neural networks. In this work, we close
this gap by embedding massively parallelized optical computation into flat
camera optics that perform neural network computation during the capture,
before recording an image on the sensor. Specifically, we harness large kernels
and propose a large-kernel spatially-varying convolutional neural network
learned via low-dimensional reparameterization techniques. We experimentally
instantiate the network with a flat meta-optical system that encompasses an
array of nanophotonic structures designed to induce angle-dependent responses.
Combined with an extremely lightweight electronic backend of approximately 2K
parameters, we demonstrate that a reconfigurable nanophotonic neural network
reaches 72.76% blind-test classification accuracy on the CIFAR-10 dataset. For
the first time, an optical neural network outperforms the first modern digital
neural network, AlexNet (72.64%, 57M parameters), bringing optical neural
networks into the modern deep learning era.
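The "large-kernel spatially-varying convolution learned via low-dimensional reparameterization" can be illustrated with a minimal NumPy sketch. All sizes and variable names below are illustrative assumptions, not taken from the paper: the image is split into a grid of regions, each region gets its own large kernel, and every regional kernel is expressed as a mixture of a few shared basis kernels, so the parameter count stays low.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, chosen only for illustration.
H = W = 32          # image size (CIFAR-10-like)
K = 9               # "large" kernel size
P = 4               # image split into a P x P grid of regions
R = 3               # low-dimensional rank: number of shared basis kernels

# Low-dimensional reparameterization: each region's kernel is a learned
# mixture of R shared basis kernels instead of K*K free parameters
# (here: P*P*R + R*K*K = 291 parameters vs. P*P*K*K = 1296 unconstrained).
basis = rng.standard_normal((R, K, K))       # shared basis kernels
coeffs = rng.standard_normal((P, P, R))      # per-region mixing weights

def region_kernels(basis, coeffs):
    """Expand per-region kernels from the low-rank parameterization."""
    # (P, P, R) x (R, K, K) -> (P, P, K, K)
    return np.einsum('ijr,rkl->ijkl', coeffs, basis)

def spatially_varying_conv(img, basis, coeffs):
    """Correlate each image region with its own large kernel."""
    kernels = region_kernels(basis, coeffs)
    out = np.zeros_like(img)
    rh, rw = img.shape[0] // P, img.shape[1] // P
    pad = K // 2
    padded = np.pad(img, pad, mode='constant')
    for i in range(P):
        for j in range(P):
            k = kernels[i, j]
            for y in range(i * rh, (i + 1) * rh):
                for x in range(j * rw, (j + 1) * rw):
                    out[y, x] = np.sum(padded[y:y + K, x:x + K] * k)
    return out

img = rng.standard_normal((H, W))
feat = spatially_varying_conv(img, basis, coeffs)
print(feat.shape)  # (32, 32)
```

In the paper's setting the convolution itself is executed optically by the meta-optic; the sketch only shows the mathematical structure that the low-rank parameterization imposes.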
Related papers
- Optical training of large-scale Transformers and deep neural networks with direct feedback alignment [48.90869997343841]
We experimentally implement a versatile and scalable training algorithm, called direct feedback alignment, on a hybrid electronic-photonic platform.
An optical processing unit performs large-scale random matrix multiplications, which is the central operation of this algorithm, at speeds up to 1500 TeraOps.
We study the compute scaling of our hybrid optical approach, and demonstrate a potential advantage for ultra-deep and wide neural networks.
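Direct feedback alignment replaces backpropagation's transposed weight matrices with fixed random feedback matrices, which is why the dominant operation becomes a large random matrix multiplication that an optical processor can execute. A minimal software sketch, with all sizes and the toy regression task being illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network trained with direct feedback alignment (DFA): the output
# error is sent to every hidden layer through a FIXED random matrix B_i,
# instead of backpropagating through W^T as in standard backprop. These
# random projections are the operation that can be offloaded to optics.
n_in, n_h, n_out = 8, 32, 2
W1 = rng.standard_normal((n_in, n_h)) * 0.1
W2 = rng.standard_normal((n_h, n_h)) * 0.1
W3 = rng.standard_normal((n_h, n_out)) * 0.1
B1 = rng.standard_normal((n_out, n_h))   # fixed random feedback matrices
B2 = rng.standard_normal((n_out, n_h))

def tanh_grad(a):
    return 1.0 - np.tanh(a) ** 2

def dfa_step(x, y, lr=0.05):
    """One DFA update on a batch (x, y); returns the mean squared error."""
    global W1, W2, W3
    a1 = x @ W1; h1 = np.tanh(a1)
    a2 = h1 @ W2; h2 = np.tanh(a2)
    e = h2 @ W3 - y                      # output error
    # Random feedback: project e straight to each hidden layer.
    d2 = (e @ B2) * tanh_grad(a2)
    d1 = (e @ B1) * tanh_grad(a1)
    W3 -= lr * h2.T @ e / len(x)
    W2 -= lr * h1.T @ d2 / len(x)
    W1 -= lr * x.T @ d1 / len(x)
    return float(np.mean(e ** 2))

# Toy regression target: learn a fixed random linear map.
T = rng.standard_normal((n_in, n_out))
x = rng.standard_normal((256, n_in))
y = x @ T
losses = [dfa_step(x, y) for _ in range(300)]
print(losses[0], losses[-1])
```

The loss decreases even though no layer ever sees the true gradient, which is the "feedback alignment" effect the hybrid platform exploits at scale.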
arXiv Detail & Related papers (2024-09-01T12:48:47Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Experimentally realized in situ backpropagation for deep learning in nanophotonic neural networks [0.7627023515997987]
We design mass-manufacturable silicon photonic neural networks that cascade our custom designed "photonic mesh" accelerator.
We demonstrate, for the first time, in situ backpropagation to solve classification tasks.
Our findings suggest a new training paradigm for photonics-accelerated artificial intelligence based entirely on a physical analog of the popular backpropagation technique.
arXiv Detail & Related papers (2022-05-17T17:13:50Z)
- A photonic chip-based machine learning approach for the prediction of molecular properties [11.55177943027656]
Photonic chip technology offers an alternative platform for implementing neural networks with faster data processing and lower energy usage.
We demonstrate the capability of photonic neural networks in predicting the quantum mechanical properties of molecules.
Our work opens the avenue for harnessing photonic technology for large-scale machine learning applications in molecular sciences.
arXiv Detail & Related papers (2022-03-03T03:15:14Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN)
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Photonic neural field on a silicon chip: large-scale, high-speed neuro-inspired computing and sensing [0.0]
Photonic neural networks have significant potential for high-speed neural processing with low latency and ultralow energy consumption.
We propose the concept of a photonic neural field and implement it experimentally on a silicon chip to realize highly scalable neuro-inspired computing.
In this study, we use the on-chip photonic neural field as a reservoir of information and demonstrate a high-speed chaotic time-series prediction with low errors.
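Using a fixed physical system as a "reservoir" and training only a linear readout is the standard reservoir-computing recipe; the on-chip photonic neural field plays the role of the reservoir. A minimal software analogue (an echo state network predicting a chaotic logistic-map series one step ahead), where every size and constant is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: only the linear readout is trained (by ridge
# regression). In the paper's setting, the photonic neural field supplies
# the fixed high-dimensional dynamics simulated here in software.
N = 200                                  # reservoir units
Win = rng.uniform(-0.5, 0.5, (N,))       # input weights
Wres = rng.standard_normal((N, N))
Wres *= 0.9 / np.max(np.abs(np.linalg.eigvals(Wres)))  # spectral radius < 1

def run_reservoir(u):
    """Drive the reservoir with input series u; collect the states."""
    x = np.zeros(N)
    states = []
    for ut in u:
        x = np.tanh(Win * ut + Wres @ x)
        states.append(x.copy())
    return np.array(states)

# Chaotic logistic-map series; task: one-step-ahead prediction.
u = np.empty(1200)
u[0] = 0.4
for t in range(1199):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

X = run_reservoir(u[:-1])
washout = 100                            # discard initial transient
A, y = X[washout:], u[1 + washout:]
# Ridge-regression readout: Wout = (A^T A + lam*I)^-1 A^T y
lam = 1e-6
Wout = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
nrmse = np.sqrt(np.mean((A @ Wout - y) ** 2)) / np.std(y)
print(nrmse)
```

Because the reservoir is never trained, the scheme tolerates fabrication variation in the physical substrate; only the cheap digital readout must be fit to data.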
arXiv Detail & Related papers (2021-05-22T09:28:51Z)
- An optical neural network using less than 1 photon per multiplication [4.003843776219224]
We experimentally demonstrate an optical neural network achieving 99% accuracy on handwritten-digit classification.
This performance was achieved using a custom free-space optical processor.
Our results provide a proof-of-principle for low-optical-power operation.
arXiv Detail & Related papers (2021-04-27T20:43:23Z)
- 11 TeraFLOPs per second photonic convolutional accelerator for deep learning optical neural networks [0.0]
We demonstrate a universal optical vector convolutional accelerator operating beyond 10 TeraFLOPS (floating-point operations per second).
We then use the same hardware to sequentially form a deep optical CNN with ten output neurons, achieving 88% accuracy in recognizing all ten digits from 900-pixel handwritten-digit images.
This approach is scalable and trainable to much more complex networks for demanding applications such as unmanned vehicles and real-time video recognition.
arXiv Detail & Related papers (2020-11-14T21:24:01Z)
- Photonics for artificial intelligence and neuromorphic computing [52.77024349608834]
Photonic integrated circuits have enabled ultrafast artificial neural networks.
Photonic neuromorphic systems offer sub-nanosecond latencies.
These systems could address the growing demand for machine learning and artificial intelligence.
arXiv Detail & Related papers (2020-10-30T21:41:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.