All-optical graph representation learning using integrated diffractive
photonic computing units
- URL: http://arxiv.org/abs/2204.10978v1
- Date: Sat, 23 Apr 2022 02:29:48 GMT
- Title: All-optical graph representation learning using integrated diffractive
photonic computing units
- Authors: Tao Yan, Rui Yang, Ziyang Zheng, Xing Lin, Hongkai Xiong, Qionghai Dai
- Abstract summary: Photonic neural networks perform brain-inspired computations using photons instead of electrons.
We propose an all-optical graph representation learning architecture, termed diffractive graph neural network (DGNN)
- We demonstrate the use of DGNN-extracted features for node- and graph-level classification tasks on benchmark databases and achieve superior performance.
- Score: 51.15389025760809
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Photonic neural networks perform brain-inspired computations using
photons instead of electrons and can thereby achieve substantially improved
computing performance. However, existing architectures can only handle data with regular
structures, e.g., images or videos, but fail to generalize to graph-structured
data beyond Euclidean space, e.g., social networks or document co-citation
networks. Here, we propose an all-optical graph representation learning
architecture, termed diffractive graph neural network (DGNN), based on the
integrated diffractive photonic computing units (DPUs) to address this
limitation. Specifically, DGNN optically encodes node attributes into strip
optical waveguides, which are transformed by DPUs and aggregated by on-chip
optical couplers to extract their feature representations. Each DPU comprises
successive passive layers of metalines to modulate the electromagnetic optical
field via diffraction, where the metaline structures are learnable parameters
shared across graph nodes. DGNN captures complex dependencies among the node
neighborhoods and eliminates the nonlinear transition functions during the
light-speed optical message passing over graph structures. We demonstrate the
use of DGNN-extracted features for node- and graph-level classification tasks
with benchmark databases and achieve superior performance. Our work opens up a
new direction for designing application-specific integrated photonic circuits
for high-efficiency processing of large-scale graph data structures using deep
learning.
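The all-linear optical message passing described in the abstract can be illustrated with a minimal NumPy sketch. Everything here is an illustrative assumption, not the paper's implementation: the DPU is stood in for by a single learned real-valued matrix `W` shared across nodes, and the on-chip optical coupler by a plain summation over neighbors, whereas the real device operates on complex electromagnetic fields.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: 4 nodes, adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))  # node attributes, 8 features per node

# Stand-in for the DPU: one learned linear operator shared across all
# graph nodes (mirroring the metaline parameters shared across nodes).
W = rng.normal(size=(8, 8))

def dgnn_layer(A, X, W):
    """One round of all-linear message passing: transform each node's
    features with the shared operator W, then aggregate each node's
    neighborhood (plus itself) by summation, the way an on-chip
    coupler combines waveguide signals. No nonlinear transition
    function is applied between rounds."""
    messages = X @ W                         # per-node "diffractive" transform
    return (A + np.eye(len(A))) @ messages   # coupler-style summation

H = dgnn_layer(A, X, W)  # feature representations, shape (4, 8)
```

Because every step is linear, stacked layers compose into a single linear operator per receptive field, which is what lets the optical implementation skip nonlinear transition functions entirely.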
Related papers
- LightDiC: A Simple yet Effective Approach for Large-scale Digraph Representation Learning [42.72417353512392]
We propose LightDiC, a scalable variant of the digraph convolution based on the magnetic Laplacian.
LightDiC is the first DiGNN to provide satisfactory results on the most representative large-scale databases.
arXiv Detail & Related papers (2024-01-22T09:09:10Z)
- Optical Neural Ordinary Differential Equations [44.97261923694945]
We propose the optical neural ordinary differential equations (ON-ODE) architecture that parameterizes the continuous dynamics of hidden layers with optical ODE solvers.
The ON-ODE comprises the PNNs followed by the photonic integrator and optical feedback loop, which can be configured to represent residual neural networks (ResNet) and recurrent neural networks with effectively reduced chip area occupancy.
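The ResNet-as-ODE view behind ON-ODE can be sketched with a plain explicit Euler integrator (the function names and step scheme below are my own illustrative choices, not the paper's): a single Euler step `h + dt * f(h)` is exactly a residual block, and shrinking the step approximates the continuous hidden-state trajectory that the photonic integrator would follow.

```python
import numpy as np

def ode_step(h, t, dt, f):
    """One explicit Euler step h <- h + dt * f(h, t): the discrete
    residual-block update that an ODE solver generalizes."""
    return h + dt * f(h, t)

def on_ode_forward(h0, f, T=1.0, steps=10):
    """Integrate hidden-state dynamics dh/dt = f(h, t) from t = 0 to T.
    With steps=1 and T=1 this reduces to a single ResNet block
    h + f(h); more steps trace the continuous dynamics."""
    h, dt = h0, T / steps
    for k in range(steps):
        h = ode_step(h, k * dt, dt, f)
    return h

# Example: linear decay dynamics f(h, t) = -h over [0, 1].
h = on_ode_forward(np.ones(3), lambda h, t: -h, T=1.0, steps=10)
```

Reusing one parameterized `f` across all steps is what yields the reduced chip-area occupancy the summary mentions: a single physical block is traversed repeatedly instead of laying out one block per layer.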
arXiv Detail & Related papers (2022-09-26T04:04:02Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
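The fixed band-pass filters that LEGS relaxes into learnable form are diffusion wavelets built from dyadic powers of a lazy random walk. A small NumPy sketch of that standard construction (illustrative only, not code from the paper):

```python
import numpy as np

def lazy_walk(A):
    """Lazy random walk matrix P = (I + A D^{-1}) / 2."""
    d = A.sum(axis=0)                    # node degrees (columns of A)
    return 0.5 * (np.eye(len(A)) + A / d)

def diffusion_wavelets(A, J):
    """Band-pass diffusion wavelets Psi_j = P^(2^(j-1)) - P^(2^j),
    j = 1..J: each filter isolates a dyadic band of diffusion
    scales. LEGS replaces these fixed dyadic differences with
    learnable combinations of diffusion steps."""
    P = lazy_walk(A)
    dyadic = [np.linalg.matrix_power(P, 2 ** j) for j in range(J + 1)]
    return [dyadic[j - 1] - dyadic[j] for j in range(1, J + 1)]

# Toy graph: the wavelets telescope, so their sum equals P - P^(2^J).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
filters = diffusion_wavelets(A, J=3)
```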
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
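A toy version of the unrolling idea behind GDN (heavily simplified for illustration: identity mixing between latent and observed graph, and a fixed rather than learned step size and sparsity threshold per layer):

```python
import numpy as np

def soft_threshold(X, tau):
    """Proximal operator of the elementwise L1 norm (promotes sparse
    edge weights)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def gdn_unrolled(A_obs, K=10, step=1.0, tau=0.05):
    """K truncated proximal-gradient iterations for graph
    deconvolution. Each iteration is one 'layer': a gradient step on
    the data-fit term followed by a sparsifying prox. In a trained
    GDN, step and tau would be learnable per layer."""
    A = np.zeros_like(A_obs)
    for _ in range(K):
        grad = A - A_obs                       # grad of 0.5 * ||A - A_obs||_F^2
        A = soft_threshold(A - step * grad, tau)
    return A

A_obs = np.array([[0.0, 0.8],
                  [0.8, 0.0]])
A_hat = gdn_unrolled(A_obs, K=5)
```

Replacing the fixed prox and step size with learned parameters, and the identity mixing with a learned graph-convolutional relationship, is what turns this classical iteration into a trainable, inductive network.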
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate a significant and consistent performance improvement in model quality with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Graph Convolutional Networks in Feature Space for Image Deblurring and Super-resolution [11.531085904098003]
We propose a novel encoder-decoder network with added graph convolutions.
Experiments show it significantly boosts performance for image restoration tasks.
We believe it opens up opportunities for GCN-based approaches in more applications.
arXiv Detail & Related papers (2021-05-21T17:02:15Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.