Cluster Algebras: Network Science and Machine Learning
- URL: http://arxiv.org/abs/2203.13847v2
- Date: Fri, 23 Feb 2024 11:16:10 GMT
- Title: Cluster Algebras: Network Science and Machine Learning
- Authors: Pierre-Philippe Dechant, Yang-Hui He, Elli Heyes, Edward Hirst
- Abstract summary: Cluster algebras have recently become an important player in mathematics and physics.
We investigate them through the lens of modern data science, specifically with techniques from network science and machine learning.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cluster algebras have recently become an important player in mathematics and
physics. In this work, we investigate them through the lens of modern data
science, specifically with techniques from network science and machine
learning. Network analysis methods are applied to the exchange graphs for
cluster algebras of varying mutation types. The analysis indicates that when
the graphs are represented without identifying clusters related by permutation
equivalence, an elegant symmetry emerges in the quiver exchange graph
embedding. The ratio between the number of seeds and the number of quivers
associated with this symmetry is computed for finite Dynkin type algebras up to
rank 5, and
conjectured for higher ranks. Simple machine learning techniques successfully
learn to classify cluster algebras using the data of seeds. The learning
performance exceeds 0.9 accuracy when distinguishing algebras of the same
mutation type, algebras of different mutation types, and genuine seed data from
artificially generated data.
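As a rough, hedged illustration of the two ingredients above (not the authors' code or data pipeline), the sketch below builds a labelled quiver exchange graph by repeated matrix mutation, where mutation at vertex $k$ sends the exchange matrix entry $b_{ij}$ to $-b_{ij}$ if $i=k$ or $j=k$, and to $b_{ij} + \mathrm{sgn}(b_{ik})\max(b_{ik}b_{kj},0)$ otherwise; it then reads off a few network statistics with networkx and fits a simple scikit-learn classifier on flattened exchange matrices as a stand-in for full seed data. The choice of types $A_4$ and $D_4$, the logistic-regression model, and the train/test split are illustrative assumptions rather than the paper's setup.
```python
# Minimal sketch (not the paper's code): quiver exchange graphs via matrix
# mutation, basic network statistics, and a toy mutation-class classifier.
from collections import deque

import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def mutate(B, k):
    """Fomin-Zelevinsky mutation of the skew-symmetric exchange matrix B at vertex k."""
    n = B.shape[0]
    new = np.empty_like(B)
    for i in range(n):
        for j in range(n):
            if i == k or j == k:
                new[i, j] = -B[i, j]
            else:
                new[i, j] = B[i, j] + np.sign(B[i, k]) * max(B[i, k] * B[k, j], 0)
    return new


def exchange_graph(B0):
    """BFS over all exchange matrices reachable from B0 by single mutations.

    Nodes are labelled matrices (no identification by permutation equivalence);
    edges join matrices related by one mutation. Finite for finite Dynkin types.
    """
    G = nx.Graph()
    G.add_node(tuple(B0.flatten()))
    queue = deque([B0])
    while queue:
        B = queue.popleft()
        for k in range(B.shape[0]):
            Bk = mutate(B, k)
            key = tuple(Bk.flatten())
            if key not in G:
                G.add_node(key)
                queue.append(Bk)
            G.add_edge(tuple(B.flatten()), key)
    return G


def dynkin_A(n):
    """Exchange matrix of a linearly oriented A_n quiver."""
    B = np.zeros((n, n), dtype=int)
    for i in range(n - 1):
        B[i, i + 1], B[i + 1, i] = 1, -1
    return B


def dynkin_D4():
    """Exchange matrix of a D_4 quiver: three leaves oriented towards a hub."""
    B = np.zeros((4, 4), dtype=int)
    for leaf in (1, 2, 3):
        B[leaf, 0], B[0, leaf] = 1, -1
    return B


if __name__ == "__main__":
    # Network-science side: labelled quiver exchange graphs and summary statistics.
    for name, B0 in [("A_4", dynkin_A(4)), ("D_4", dynkin_D4())]:
        G = exchange_graph(B0)
        print(name, G.number_of_nodes(), "quivers,",
              G.number_of_edges(), "mutation edges,",
              "avg clustering", round(nx.average_clustering(G), 3))

    # Machine-learning side: classify which mutation class a matrix came from.
    XA = [list(node) for node in exchange_graph(dynkin_A(4)).nodes]
    XD = [list(node) for node in exchange_graph(dynkin_D4()).nodes]
    X = np.array(XA + XD)
    y = np.array([0] * len(XA) + [1] * len(XD))
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
```
Comparing the node count of such a labelled graph with the number of quivers counted up to permutation equivalence is one simple way to probe the seed-to-quiver ratio discussed in the abstract.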
Related papers
- Machines and Mathematical Mutations: Using GNNs to Characterize Quiver Mutation Classes [4.229995708813431]
We use graph neural networks and AI explainability techniques to discover mutation equivalence criteria for quivers of type $\tilde{D}_n$.
We also show that our model captures structure within its hidden representation that allows us to reconstruct known criteria from type $D_n$.
arXiv Detail & Related papers (2024-11-12T01:09:41Z) - Shedding Light on Problems with Hyperbolic Graph Learning [2.3743504594834635]
Recent papers in the graph machine learning literature have introduced a number of approaches for hyperbolic representation learning.
We take a careful look at the field of hyperbolic graph representation learning as it stands today.
We find that a number of papers fail to present baselines diligently, make faulty modelling assumptions when constructing algorithms, and use misleading metrics to quantify the geometry of graph datasets.
arXiv Detail & Related papers (2024-11-11T03:12:41Z) - Machine Learning Mutation-Acyclicity of Quivers [0.0]
This paper applies machine learning techniques to the study of quivers--a type of directed multigraph with significant relevance in algebra.
We focus on determining the mutation-acyclicity of a quiver on 4 vertices, a property that is pivotal since mutation-acyclicity is often a necessary condition for theorems involving path algebras and cluster algebras.
Using neural networks (NNs) and support vector machines (SVMs), we accurately classify more general 4-vertex quivers as mutation-acyclic or non-mutation-acyclic.
arXiv Detail & Related papers (2024-11-06T19:08:30Z) - Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - Generation is better than Modification: Combating High Class Homophily Variance in Graph Anomaly Detection [51.11833609431406]
In graph anomaly detection, homophily distribution differences between classes are significantly greater than those in homophilic and heterophilic graphs.
We introduce a new metric called Class Homophily Variance, which quantitatively describes this phenomenon.
To mitigate its impact, we propose a novel GNN model named Homophily Edge Generation Graph Neural Network (HedGe).
arXiv Detail & Related papers (2024-03-15T14:26:53Z) - Machine Learning Clifford invariants of ADE Coxeter elements [2.0269884338680866]
We perform exhaustive calculations of all Coxeter transformations for $A_8$, $D_8$ and $E_8$ for a choice of basis of simple roots.
This computational algebra paradigm generates a dataset that can then be mined using techniques from data science.
This paper is a pump-priming study in experimental mathematics using Clifford algebras.
arXiv Detail & Related papers (2023-09-29T18:00:01Z) - Unsupervised Learning of Invariance Transformations [105.54048699217668]
We develop an algorithmic framework for finding approximate graph automorphisms.
We discuss how this framework can be used to find approximate automorphisms in weighted graphs in general.
arXiv Detail & Related papers (2023-07-24T17:03:28Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras
from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Category-Learning with Context-Augmented Autoencoder [63.05016513788047]
Finding an interpretable non-redundant representation of real-world data is one of the key problems in Machine Learning.
We propose a novel method of using data augmentations when training autoencoders.
We train a Variational Autoencoder in such a way that the transformation outcome is predictable by an auxiliary network.
arXiv Detail & Related papers (2020-10-10T14:04:44Z) - Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.