Applying language models to algebraic topology: generating simplicial
cycles using multi-labeling in Wu's formula
- URL: http://arxiv.org/abs/2306.16951v1
- Date: Thu, 1 Jun 2023 12:23:14 GMT
- Title: Applying language models to algebraic topology: generating simplicial
cycles using multi-labeling in Wu's formula
- Authors: Kirill Brilliantov, Fedor Pavutnitskiy, Dmitry Pasechnyuk, German
Magai
- Abstract summary: We take a step towards the goal of comprehending the group-theoretic structure of the generators of homotopy groups by leveraging the power of machine learning.
We present and evaluate language modelling approaches that employ multi-label information for input sequences, along with the necessary group-theoretic toolkit and non-neural baselines.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computing homotopy groups of spheres has long been a fundamental objective in
algebraic topology. Various theoretical and algorithmic approaches have been
developed to tackle this problem. In this paper we take a step towards the goal
of comprehending the group-theoretic structure of the generators of these
homotopy groups by leveraging the power of machine learning. Specifically, in
the simplicial group setting of Wu's formula, we reformulate the problem of
generating simplicial cycles as a problem of sampling from the intersection of
algorithmic datasets related to Dyck languages. We present and evaluate
language modelling approaches that employ multi-label information for input
sequences, along with the necessary group-theoretic toolkit and non-neural
baselines.
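The Dyck-language reformulation can be illustrated with a minimal sketch. This is an assumed encoding for illustration, not the authors' actual pipeline: words in a free group are written as lists of signed integers (with `-k` the inverse of generator `k`), free reduction plays the role of bracket matching in a Dyck language, and a candidate cycle is a word that reduces to the identity after deleting each generator in turn, i.e. a word lying in the intersection of several Dyck-like languages.

```python
def free_reduce(word):
    """Freely reduce a word in a free group.

    Letters are nonzero integers; -k is the inverse of generator k.
    Cancelling adjacent inverse pairs is the same stack discipline
    used to recognize a Dyck language of balanced brackets.
    """
    stack = []
    for letter in word:
        if stack and stack[-1] == -letter:
            stack.pop()  # cancel an adjacent inverse pair
        else:
            stack.append(letter)
    return stack


def trivial_after_deletions(word, num_generators):
    """Check the intersection-style condition: the word must reduce
    to the identity after deleting each generator in turn."""
    for g in range(1, num_generators + 1):
        projected = [letter for letter in word if abs(letter) != g]
        if free_reduce(projected):  # nonempty => not the identity
            return False
    return True
```

For example, the commutator `[1, 2, -1, -2]` (i.e. $x_1 x_2 x_1^{-1} x_2^{-1}$) over two generators satisfies the condition: deleting either generator leaves a word that cancels to the identity, while the word itself is nontrivial.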
Related papers
- Interpretable Target-Feature Aggregation for Multi-Task Learning based on Bias-Variance Analysis [53.38518232934096]
Multi-task learning (MTL) is a powerful machine learning paradigm designed to leverage shared knowledge across tasks to improve generalization and performance.
We propose an MTL approach at the intersection between task clustering and feature transformation based on a two-phase iterative aggregation of targets and features.
In both phases, a key aspect is to preserve the interpretability of the reduced targets and features through the aggregation with the mean, which is motivated by applications to Earth science.
arXiv Detail & Related papers (2024-06-12T08:30:16Z)
- Cluster-Algorithm-Amenable Models of Gauge Fields and Matter [0.0]
We focus on cluster algorithms which do not involve the determinant and involve a more physically relevant sampling of the configuration space.
We develop new cluster algorithms and design classes of models for fermions coupled to $\mathbb{Z}$ and $U(1)$ fields that are amenable to being simulated by these cluster algorithms in a sign-problem-free way.
arXiv Detail & Related papers (2023-12-28T07:21:52Z)
- Learning to be Simple [0.0]
We employ machine learning to understand structured mathematical data involving finite groups.
We derive a theorem about necessary properties of generators of finite simple groups.
Our work highlights the possibility of generating new conjectures and theorems in mathematics with the aid of machine learning.
arXiv Detail & Related papers (2023-12-08T19:00:00Z)
- Applications of Finite non-Abelian Simple Groups to Cryptography in the Quantum Era [0.0]
We review some applications of finite non-abelian simple groups to cryptography and discuss different scenarios in which this theory is clearly central.
We look at constructions based on various group-theoretic factorization problems, review group theoretical hash functions, and discuss fully homomorphic encryption using simple groups.
arXiv Detail & Related papers (2023-08-28T17:30:00Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- A Unified Approach to Synchronization Problems over Subgroups of the Orthogonal Group [29.714239628405515]
We consider the class of synchronization problems in which the group is a closed subgroup of the orthogonal group.
We propose a unified approach for solving this class of problems.
We show that our approach outperforms existing methods.
arXiv Detail & Related papers (2020-09-16T07:25:50Z)
- Model Fusion with Kullback--Leibler Divergence [58.20269014662046]
We propose a method to fuse posterior distributions learned from heterogeneous datasets.
Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors.
arXiv Detail & Related papers (2020-07-13T03:27:45Z)
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.