Extending the planar theory of anyons to quantum wire networks
- URL: http://arxiv.org/abs/2301.06590v2
- Date: Wed, 09 Oct 2024 11:38:21 GMT
- Title: Extending the planar theory of anyons to quantum wire networks
- Authors: Tomasz Maciazek, Mia Conlon, Gert Vercleyen, J. K. Slingerland
- Abstract summary: We establish graph-braided anyon fusion models for general wire networks.
In particular, we prove that triconnected networks yield the same braiding exchange operators as the planar anyon models.
We conjecture that the graph-braided anyon fusion models will possess the (generalised) coherence property.
- Abstract: The braiding of the worldlines of particles restricted to move on a network (graph) is governed by the graph braid group, which can be strikingly different from the standard braid group known from two-dimensional physics. It has been recently shown that imposing the compatibility of graph braiding with anyon fusion for anyons exchanging at a single wire junction leads to new types of anyon models with the braiding exchange operators stemming from solutions of certain generalised hexagon equations. In this work, we establish these graph-braided anyon fusion models for general wire networks. We show that the character of braiding strongly depends on the graph-theoretic connectivity of the given network. In particular, we prove that triconnected networks yield the same braiding exchange operators as the planar anyon models. In contrast, modular biconnected networks support independent braiding exchange operators in different modules. Consequently, such modular networks may lead to more efficient topological quantum computer circuits. Finally, we conjecture that the graph-braided anyon fusion models will possess the (generalised) coherence property where certain polygon equations determine the braiding exchange operators for an arbitrary number of anyons. We also extensively study solutions to these polygon equations for chosen low-rank multiplicity-free fusion rings, including the Ising theory, quantum double of Z2, and Tambara-Yamagami models. We find numerous solutions that do not appear in the planar theory of anyons.
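For context, the planar hexagon equation that the paper's generalised equations extend can be checked numerically. The sketch below verifies it for the standard Ising model (anyons 1, sigma, psi) in one common convention; the F- and R-symbol values are the textbook planar ones, not taken from this paper, whose graph-braided generalisations differ.

```python
import numpy as np

# F-matrix (F^{sigma sigma sigma}_sigma) in the intermediate-charge basis {1, psi}
F = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# R-symbols for exchanging two sigmas, in fusion channel 1 or psi
R = np.diag([np.exp(-1j * np.pi / 8), np.exp(3j * np.pi / 8)])

# R-symbols for exchanging sigma with the fused pair (charge 1 or psi)
R_mid = np.diag([1, np.exp(-1j * np.pi / 2)])

# Hexagon equation (total charge sigma, multiplicity-free): R F R = F R_mid F
lhs = R @ F @ R
rhs = F @ R_mid @ F
print("hexagon equation satisfied:", np.allclose(lhs, rhs))
```

The graph-braided models replace this single hexagon with families of polygon equations whose solution sets, per the abstract, can be strictly larger.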
Related papers
- Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning [77.1421343649344]
We propose a generalization of Transformers towards operating entirely on the product of constant curvature spaces.
We also provide a kernelized approach to non-Euclidean attention, which enables our model to run in time and memory cost linear to the number of nodes and edges.
arXiv Detail & Related papers (2023-09-08T02:44:37Z)
- Unwrapping All ReLU Networks [1.370633147306388]
Deep ReLU Networks can be decomposed into a collection of linear models.
We extend this decomposition to Graph Neural networks and tensor convolutional networks.
We show how this model leads to computing cheap and exact SHAP values.
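A minimal sketch of the decomposition idea (not the paper's construction): a two-layer ReLU network is linear on each region of fixed activation pattern, so around any input we can read off an exact local linear model. All weights below are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 3)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def net(x):
    return W2 @ np.maximum(W1 @ x + b1, 0) + b2

def local_linear_model(x):
    # D zeroes out inactive ReLU units; on x's activation region,
    # the network coincides with the affine map W_eff x + b_eff.
    D = np.diag((W1 @ x + b1 > 0).astype(float))
    return W2 @ D @ W1, W2 @ D @ b1 + b2

x = rng.normal(size=3)
W_eff, b_eff = local_linear_model(x)
print(np.allclose(net(x), W_eff @ x + b_eff))
```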
arXiv Detail & Related papers (2023-05-16T13:30:15Z)
- On Optimizing the Communication of Model Parallelism [74.15423270435949]
We study a novel and important communication pattern in large-scale model-parallel deep learning (DL).
In cross-mesh resharding, a sharded tensor needs to be sent from a source device mesh to a destination device mesh.
We propose two contributions to address cross-mesh resharding: an efficient broadcast-based communication system, and an "overlapping-friendly" pipeline schedule.
arXiv Detail & Related papers (2022-11-10T03:56:48Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
To sum up, we focus on the manifold model as the limit of large graphs and construct MNNs, while we can still bring back graph neural networks by the discretization of MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
- Joint Network Topology Inference via a Shared Graphon Model [24.077455621015552]
We consider the problem of estimating the topology of multiple networks from nodal observations.
We adopt a graphon as our random graph model, which is a nonparametric model from which graphs of potentially different sizes can be drawn.
arXiv Detail & Related papers (2022-09-17T02:38:58Z)
- Compatibility of Braiding and Fusion on Wire Networks [0.0]
Exchanging particles on graphs, or more concretely on networks of quantum wires, has been proposed as a means to perform fault tolerant quantum computation.
We find the usual planar anyon solutions but also more general braid actions.
We illustrate this with Abelian, Fibonacci and Ising fusion rules.
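The Ising fusion rules mentioned above are small enough to write down in full. A hedged illustration (labels assumed, not the paper's notation), using "1" for the vacuum, "s" for sigma, and "p" for psi:

```python
# Ising fusion ring: 1 is the vacuum, s x s = 1 + p, s x p = s, p x p = 1
ISING = {
    ("1", "1"): ["1"], ("1", "s"): ["s"], ("1", "p"): ["p"],
    ("s", "s"): ["1", "p"], ("s", "p"): ["s"], ("p", "p"): ["1"],
}

def fuse(a, b, table=ISING):
    # fusion is commutative, so look up either ordering of the pair
    return table.get((a, b)) or table[(b, a)]

print(fuse("s", "s"))  # ['1', 'p']
print(fuse("p", "s"))  # ['s']
```

The nontrivial channel s x s = 1 + p is what makes Ising anyons non-abelian; abelian and Fibonacci rules fit the same table format.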
arXiv Detail & Related papers (2022-02-16T17:28:06Z)
- Universal properties of anyon braiding on one-dimensional wire networks [0.0]
We show that anyons on wire networks have fundamentally different braiding properties than anyons in 2D.
The character of braiding depends on a topological invariant of the network: its connectedness.
arXiv Detail & Related papers (2020-07-02T15:42:09Z)
- Geometric presentations of braid groups for particles on a graph [0.0]
We study geometric presentations of braid groups for particles constrained to move on a graph.
In particular, we show that for $3$-connected planar graphs such a quotient reconstructs the well-known planar braid group.
Our results are of particular relevance for the study of non-abelian anyons on networks showing new possibilities for non-abelian quantum statistics on graphs.
arXiv Detail & Related papers (2020-06-27T02:10:22Z)
- Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z)
- Neural Networks are Convex Regularizers: Exact Polynomial-time Convex Optimization Formulations for Two-layer Networks [70.15611146583068]
We develop exact representations of training two-layer neural networks with rectified linear units (ReLUs).
Our theory utilizes semi-infinite duality and minimum norm regularization.
arXiv Detail & Related papers (2020-02-24T21:32:41Z)
- Understanding Graph Neural Networks with Generalized Geometric Scattering Transforms [67.88675386638043]
The scattering transform is a multilayered wavelet-based deep learning architecture that acts as a model of convolutional neural networks.
We introduce windowed and non-windowed geometric scattering transforms for graphs based upon a very general class of asymmetric wavelets.
We show that these asymmetric graph scattering transforms have many of the same theoretical guarantees as their symmetric counterparts.
arXiv Detail & Related papers (2019-11-14T17:23:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and accepts no responsibility for any consequences of its use.