Revisiting Tropical Polynomial Division: Theory, Algorithms and
Application to Neural Networks
- URL: http://arxiv.org/abs/2306.15157v1
- Date: Tue, 27 Jun 2023 02:26:07 GMT
- Title: Revisiting Tropical Polynomial Division: Theory, Algorithms and
Application to Neural Networks
- Authors: Ioannis Kordonis, Petros Maragos
- Abstract summary: Tropical geometry has recently found several applications in the analysis of neural networks with piecewise linear activation functions.
This paper presents a new look at the problem of tropical polynomial division and its application to the simplification of neural networks.
- Score: 40.137069931650444
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tropical geometry has recently found several applications in the analysis of
neural networks with piecewise linear activation functions. This paper presents
a new look at the problem of tropical polynomial division and its application
to the simplification of neural networks. We analyze tropical polynomials with
real coefficients, extending earlier ideas and methods developed for
polynomials with integer coefficients. We first prove the existence of a unique
quotient-remainder pair and characterize the quotient in terms of the convex
bi-conjugate of a related function. Interestingly, the quotient of tropical
polynomials with integer coefficients does not necessarily have integer
coefficients. Furthermore, we develop a relationship of tropical polynomial
division with the computation of the convex hull of unions of convex polyhedra
and use it to derive an exact algorithm for tropical polynomial division. An
approximate algorithm is also presented, based on an alternation between data
partition and linear programming. We also develop special techniques to divide
composite polynomials, described as sums or maxima of simpler ones. Finally, we
present some numerical results to illustrate the efficiency of the algorithms
proposed, using the MNIST handwritten digit and CIFAR-10 datasets.
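As a concrete illustration of the max-plus setting and of the bi-conjugate characterization mentioned in the abstract, the following is a minimal NumPy sketch, not the authors' implementation: the grid-based Legendre-Fenchel transform, the example polynomials, and all function names are illustrative assumptions. It evaluates a univariate dividend p and divisor d, approximates the convex bi-conjugate of f = p - d as a candidate quotient q, and checks the division constraint q(x) + d(x) <= p(x).

```python
# Minimal sketch of tropical (max-plus) polynomial evaluation and a grid-based
# convex bi-conjugate, assuming the convention p(x) = max_i (c_i + a_i * x).
# Illustrative only; this is not the exact or approximate algorithm of the paper.
import numpy as np

def tropical_eval(coeffs, slopes, x):
    """Evaluate p(x) = max_i (coeffs[i] + slopes[i] * x) at every point of x."""
    return np.max(coeffs[:, None] + slopes[:, None] * x[None, :], axis=0)

def conjugate(values, grid, dual_grid):
    """Discrete Legendre-Fenchel conjugate: g*(s) = max_x (s * x - g(x))."""
    return np.max(dual_grid[:, None] * grid[None, :] - values[None, :], axis=1)

# Dividend p(x) = max(0, 1 + x, 1 + 2x) and divisor d(x) = max(0, x).
xs = np.linspace(-5.0, 5.0, 2001)
p = tropical_eval(np.array([0.0, 1.0, 1.0]), np.array([0.0, 1.0, 2.0]), xs)
d = tropical_eval(np.array([0.0, 0.0]), np.array([0.0, 1.0]), xs)

# Candidate quotient: the convex bi-conjugate (largest convex minorant) of
# f = p - d, approximated by conjugating twice over a finite slope grid.
slopes = np.linspace(-3.0, 3.0, 601)
q = conjugate(conjugate(p - d, xs, slopes), slopes, xs)

# Division constraint: the quotient never overshoots, i.e. q + d <= p.
assert np.all(q + d <= p + 1e-9)
print("max slack of p - (q + d):", np.max(p - (q + d)))
```

The pointwise slack p - (q + d) marks where a remainder r is needed so that p = (q ⊗ d) ⊕ r, i.e. p(x) = max(q(x) + d(x), r(x)); the exact and approximate algorithms of the paper construct such quotient-remainder pairs for multivariate polynomials.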
Related papers
- Complementary polynomials in quantum signal processing [0.0]
To implement a given polynomial $P$, one must first construct a corresponding complementary polynomial $Q$.
Existing approaches to this problem employ numerical methods that are not amenable to explicit error analysis.
We present a new approach to constructing complementary polynomials using complex analysis.
arXiv Detail & Related papers (2024-06-06T16:47:11Z) - Exploring the Potential of Polynomial Basis Functions in Kolmogorov-Arnold Networks: A Comparative Study of Different Groups of Polynomials [0.0]
This paper presents a survey of 18 distinct polynomial families and their potential applications in Kolmogorov-Arnold Network (KAN) models.
The study investigates the suitability of these polynomials as basis functions in KAN models for complex tasks like handwritten digit classification.
The performance metrics of the KAN models, including overall accuracy, Kappa, and F1 score, are evaluated and compared.
arXiv Detail & Related papers (2024-05-30T20:40:16Z) - Algebraic Complexity and Neurovariety of Linear Convolutional Networks [0.0]
We study linear convolutional networks with one-dimensional convolutions and arbitrary strides.
We generate equations whose common zero locus corresponds to the Zariski closure of the corresponding neuromanifold.
Our findings reveal that the number of all complex critical points in the optimization of such a network is equal to the generic Euclidean distance degree of a Segre variety.
arXiv Detail & Related papers (2024-01-29T23:00:15Z) - Covering Number of Real Algebraic Varieties and Beyond: Improved Bounds and Applications [8.438718130535296]
We prove upper bounds on the covering number of real algebraic varieties and related sets in Euclidean space.
We show that these bounds improve on the best known general bound by Yomdin-Comte.
We illustrate the power of the result on three computational applications.
arXiv Detail & Related papers (2023-11-09T03:06:59Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Algorithmic Complexities in Backpropagation and Tropical Neural Networks [0.0]
We introduce artificial neural networks in terms of tropical arithmetic and tropical algebraic geometry.
We verify that the algorithmic complexity is substantially lower than that of the usual backpropagation, since tropical arithmetic replaces ordinary multiplication with addition (a minimal max-plus product is sketched after this list).
arXiv Detail & Related papers (2021-01-03T22:19:17Z) - Finite-Function-Encoding Quantum States [52.77024349608834]
We introduce finite-function-encoding (FFE) states which encode arbitrary $d$-valued logic functions.
We investigate some of their structural properties.
arXiv Detail & Related papers (2020-12-01T13:53:23Z) - On Function Approximation in Reinforcement Learning: Optimism in the
Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the learning problem.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm that our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - SURF: A Simple, Universal, Robust, Fast Distribution Learning Algorithm [64.13217062232874]
SURF is an algorithm for approximating distributions by piecewise polynomials.
It outperforms state-of-the-art algorithms in experiments.
arXiv Detail & Related papers (2020-02-22T01:03:33Z)
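As referenced in the tropical-neural-network entry above, here is a minimal max-plus sketch (an illustrative assumption, not that paper's code) showing how a tropical matrix product replaces the sum-of-products of a linear layer with a max-of-sums, so that no ordinary multiplications are involved.

```python
# Max-plus (tropical) matrix product: (A ⊗ B)[i, j] = max_k (A[i, k] + B[k, j]).
# Additions and maxima only -- no ordinary multiplications are performed.
import numpy as np

def maxplus_matmul(A, B):
    """Tropical product of A (m x k) and B (k x n)."""
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

# Compare an ordinary linear layer with its max-plus counterpart.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))      # weights
x = rng.normal(size=(3, 1))      # input column vector

ordinary = W @ x                 # sum of products
tropical = maxplus_matmul(W, x)  # max of sums
print("ordinary:", ordinary.ravel())
print("tropical:", tropical.ravel())
```

Each output coordinate of the max-plus product, max_k (W[i, k] + x[k]), is itself the value of a tropical polynomial, which is what links such layers to the division framework of the main paper.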