From Polynomials to Databases: Arithmetic Structures in Galois Theory
- URL: http://arxiv.org/abs/2511.16622v1
- Date: Thu, 20 Nov 2025 18:29:38 GMT
- Title: From Polynomials to Databases: Arithmetic Structures in Galois Theory
- Authors: Jurgen Mezinaj
- Abstract summary: We develop a framework for classifying Galois groups of irreducible degree-7 polynomials over $\mathbb{Q}$, combining explicit resolvent methods with machine learning techniques. A database of over one million normalized projective septics is constructed, each annotated with invariants $J_0, \dots, J_4$ derived from binary transvections.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a computational framework for classifying Galois groups of irreducible degree-7 polynomials over~$\mathbb{Q}$, combining explicit resolvent methods with machine learning techniques. A database of over one million normalized projective septics is constructed, each annotated with algebraic invariants~$J_0, \dots, J_4$ derived from binary transvections. For each polynomial, we compute resolvent factorizations to determine its Galois group among the seven transitive subgroups of~$S_7$ identified by Foulkes. Using this dataset, we train a neurosymbolic classifier that integrates invariant-theoretic features with supervised learning, yielding improved accuracy in detecting rare solvable groups compared to coefficient-based models. The resulting database provides a reproducible resource for constructive Galois theory and supports empirical investigations into group distribution under height constraints. The methodology extends to higher-degree cases and illustrates the utility of hybrid symbolic-numeric techniques in computational algebra.
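The generate-and-filter stage behind the database described above can be sketched in a few lines. The sketch below is a simplification, not the authors' pipeline: the invariants $J_0, \dots, J_4$, the projective normalization, and the resolvent factorizations are omitted, and only the enumeration of monic integer septics of bounded coefficient height with an irreducibility filter is shown.

```python
# Minimal sketch of the database-construction step: enumerate monic
# integer septics of bounded coefficient height and keep those that are
# irreducible over Q.  The paper's invariants and resolvents are NOT
# computed here.
from itertools import product
from sympy import Poly, Symbol

x = Symbol("x")

def irreducible_septics(height):
    """Yield monic degree-7 integer polynomials with coefficients in
    [-height, height] that are irreducible over Q."""
    for coeffs in product(range(-height, height + 1), repeat=7):
        p = Poly([1, *coeffs], x)
        if p.is_irreducible:
            yield p

# Collect a handful of height-1 examples.
sample = []
for p in irreducible_septics(1):
    sample.append(p)
    if len(sample) == 5:
        break

print(sample[0].as_expr())
```

Scaling this enumeration (and annotating each polynomial with its invariants and Galois group) is what produces a dataset of the size reported in the abstract.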
Related papers
- Neuro-Symbolic Learning for Galois Groups: Unveiling Probabilistic Trends in Polynomials [0.0]
This paper presents a neurosymbolic approach to classifying Galois groups of irreducible polynomials. By combining neural networks with symbolic reasoning, we develop a model that outperforms purely numerical methods in accuracy and interpretability. This work paves the way for future research in computational algebra, with implications for conjectures and higher-degree classifications.
arXiv Detail & Related papers (2025-02-28T08:42:57Z)
- Galois groups of polynomials and neurosymbolic networks [0.0]
This paper introduces a novel approach to understanding Galois theory, one of the foundational areas of algebra, through the lens of machine learning. By analyzing equations with machine learning techniques, we aim to streamline the process of determining solvability by radicals and explore broader applications within Galois theory.
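The solvability-by-radicals criterion mentioned above reduces to a group-theoretic test: a polynomial's roots are expressible by radicals iff its Galois group is solvable. For degree 7, six of the seven transitive subgroups of $S_7$ can be built directly with sympy's permutation-group machinery; the sketch below (PSL(3,2), the order-168 group, is omitted since its degree-7 generators are less standard) checks which ones are solvable.

```python
# Build six of the seven transitive subgroups of S_7 and test solvability.
from sympy.combinatorics import Permutation, PermutationGroup
from sympy.combinatorics.named_groups import (
    AlternatingGroup, CyclicGroup, DihedralGroup, SymmetricGroup)

sigma = Permutation([1, 2, 3, 4, 5, 6, 0])   # the 7-cycle i -> i+1 (mod 7)
mul2  = Permutation([0, 2, 4, 6, 1, 3, 5])   # i -> 2i (mod 7), order 3
mul3  = Permutation([0, 3, 6, 2, 5, 1, 4])   # i -> 3i (mod 7), order 6

groups = {
    "C7":  CyclicGroup(7),                    # order 7
    "D7":  DihedralGroup(7),                  # order 14
    "F21": PermutationGroup([sigma, mul2]),   # Frobenius group of order 21
    "F42": PermutationGroup([sigma, mul3]),   # Frobenius group of order 42
    "A7":  AlternatingGroup(7),               # order 2520
    "S7":  SymmetricGroup(7),                 # order 5040
}

for name, G in groups.items():
    print(name, G.order(), G.is_solvable)
```

Only the four groups of order at most 42 are solvable; septics with Galois group PSL(3,2), $A_7$, or $S_7$ are not solvable by radicals.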
arXiv Detail & Related papers (2025-01-22T16:05:59Z)
- Accelerated Discovery of Machine-Learned Symmetries: Deriving the Exceptional Lie Groups G2, F4 and E6 [55.41644538483948]
This letter introduces two improved algorithms that significantly speed up the discovery of symmetry transformations.
Given the significant complexity of the exceptional Lie groups, our results demonstrate that this machine-learning method for discovering symmetries is completely general and can be applied to a wide variety of labeled datasets.
arXiv Detail & Related papers (2023-07-10T20:25:44Z)
- Applying language models to algebraic topology: generating simplicial cycles using multi-labeling in Wu's formula [0.0]
We take a step towards the goal of comprehending the group-theoretic structure of the generators of homotopy groups by leveraging the power of machine learning.
We present and evaluate language modelling approaches that employ multi-label information for input sequences, along with the necessary group-theoretic toolkit and non-neural baselines.
arXiv Detail & Related papers (2023-06-01T12:23:14Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Multiparameter Persistent Homology - Generic Structures and Quantum Computing [0.0]
This article is an application of commutative algebra to the study of persistent homology in topological data analysis.
The generic structure of such resolutions and the classifying spaces are studied using results spanning several decades of research.
arXiv Detail & Related papers (2022-10-20T17:30:20Z)
- Permutation Invariant Representations with Applications to Graph Deep Learning [8.403227482145297]
This paper presents two Euclidean embeddings of quotient space generated by matrices that are identified modulo arbitrary row permutations.
An almost everywhere injective scheme can be implemented with minimal redundancy and low computational cost.
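An almost-everywhere-injective, low-cost scheme of the kind this summary describes can be illustrated with a simple stand-in (not the paper's construction): map each matrix to its rows sorted lexicographically. The result is invariant under any row permutation and injective except on the measure-zero set of matrices with repeated rows.

```python
import numpy as np

def sorted_rows_embedding(X):
    """Permutation-invariant representation: sort rows lexicographically.
    Invariant under any row permutation; injective whenever rows are distinct."""
    # np.lexsort reads keys last-first, so reverse the columns to make
    # column 0 the primary sort key.
    order = np.lexsort(X.T[::-1])
    return X[order]

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
perm = rng.permutation(5)
# Permuting the rows does not change the representation.
assert np.array_equal(sorted_rows_embedding(X), sorted_rows_embedding(X[perm]))
```

The cost is a single sort, which matches the "minimal redundancy and low computational cost" regime the summary refers to.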
arXiv Detail & Related papers (2022-03-14T23:13:59Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
Key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
arXiv Detail & Related papers (2021-04-19T17:21:54Z)
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
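A single geometrically-driven step on $O(d)$ of the kind this summary describes can be sketched with the standard Cayley retraction (a generic illustration of optimization on the orthogonal group, not the paper's specific algorithm): project the ambient gradient onto a skew-symmetric direction, then retract exactly back onto the group.

```python
import numpy as np

def cayley_step(W, G, lr=0.1):
    """One update on the orthogonal group O(d).
    W: current orthogonal matrix; G: ambient gradient of the loss at W.
    The Cayley transform of a skew-symmetric matrix is orthogonal, so the
    iterate stays on O(d) up to floating-point error."""
    A = G @ W.T - W @ G.T                                    # skew-symmetric
    I = np.eye(W.shape[0])
    Q = np.linalg.solve(I + (lr / 2) * A, I - (lr / 2) * A)  # Cayley transform
    return Q @ W

rng = np.random.default_rng(1)
W, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal start
G = rng.standard_normal((4, 4))                   # arbitrary ambient gradient
W_new = cayley_step(W, G)
# Orthogonality is preserved by construction.
print(np.linalg.norm(W_new.T @ W_new - np.eye(4)))
```

Because the constraint is maintained exactly, no projection or re-orthogonalization pass is needed between steps, which is what makes such updates attractive inside deep and recurrent networks.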
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed here and is not responsible for any consequences of its use.