On the Convergence of Tsetlin Machines for the XOR Operator
- URL: http://arxiv.org/abs/2101.02547v1
- Date: Thu, 7 Jan 2021 14:13:41 GMT
- Title: On the Convergence of Tsetlin Machines for the XOR Operator
- Authors: Lei Jiao, Xuan Zhang, Ole-Christoffer Granmo, K. Darshana Abeyrathna
- Abstract summary: The Tsetlin Machine (TM) is a novel machine learning algorithm with several distinct properties.
We analyze the convergence of the TM when input is non-linearly related to output by the XOR-operator.
- Score: 16.85020149902772
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Tsetlin Machine (TM) is a novel machine learning algorithm with several
distinct properties, including transparent inference and learning using
hardware-near building blocks. Although numerous papers explore the TM
empirically, many of its properties have not yet been analyzed mathematically.
In this article, we analyze the convergence of the TM when input is
non-linearly related to output by the XOR-operator. Our analysis reveals that
the TM, with just two conjunctive clauses, can converge almost surely to
reproducing XOR, learning from training data over an infinite time horizon.
Furthermore, the analysis shows how the hyper-parameter T guides clause
construction so that the clauses capture the distinct sub-patterns in the data.
Our analysis of convergence for XOR thus lays the foundation for analyzing
other more complex logical expressions. Altogether, these analyses provide, from
a mathematical perspective, new insights into why TMs have obtained
state-of-the-art performance on several pattern recognition problems.
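To make the abstract's claim concrete, here is a minimal Python sketch of the inference side only, assuming the two converged positive-polarity clauses the paper shows the TM learns for XOR: (x1 AND NOT x2) and (NOT x1 AND x2). The Tsetlin-automaton learning dynamics and the hyper-parameter T are omitted; the `clause` and `tm_predict` names are illustrative, not from the paper.

```python
def clause(literals, x):
    """Evaluate a conjunctive clause.

    `literals` maps a feature index to its required truth value:
    True means the clause includes literal x_i, False means NOT x_i.
    """
    return all(x[i] == v for i, v in literals.items())


def tm_predict(x):
    # The two clauses capture XOR's two distinct sub-patterns.
    c1 = clause({0: True, 1: False}, x)   # x1 AND NOT x2
    c2 = clause({0: False, 1: True}, x)   # NOT x1 AND x2
    # Class sum = number of matching clauses, thresholded at >= 1.
    return int(c1 or c2)


for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, tm_predict(x))
```

With these two clauses the class sum is positive exactly on the inputs (0, 1) and (1, 0), reproducing XOR.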
Related papers
- To CoT or not to CoT? Chain-of-thought helps mainly on math and symbolic reasoning [55.52872152909785]
Chain-of-thought (CoT) via prompting is the de facto method for eliciting reasoning capabilities from large language models (LLMs).
We show that CoT gives strong performance benefits primarily on tasks involving math or logic, with much smaller gains on other types of tasks.
arXiv Detail & Related papers (2024-09-18T17:55:00Z) - MuSR: Testing the Limits of Chain-of-thought with Multistep Soft Reasoning [63.80739044622555]
We introduce MuSR, a dataset for evaluating language models on soft reasoning tasks specified in a natural language narrative.
This dataset has two crucial features. First, it is created through a novel neurosymbolic synthetic-to-natural generation algorithm.
Second, our dataset instances are free text narratives corresponding to real-world domains of reasoning.
arXiv Detail & Related papers (2023-10-24T17:59:20Z) - TRIGO: Benchmarking Formal Mathematical Proof Reduction for Generative Language Models [68.65075559137608]
We propose TRIGO, an ATP benchmark that not only requires a model to reduce a trigonometric expression with step-by-step proofs but also evaluates a generative LM's reasoning ability on formulas.
We gather trigonometric expressions and their reduced forms from the web, annotate the simplification process manually, and translate it into the Lean formal language system.
We develop an automatic generator based on Lean-Gym to create dataset splits of varying difficulties and distributions in order to thoroughly analyze the model's generalization ability.
arXiv Detail & Related papers (2023-10-16T08:42:39Z) - Generalized Convergence Analysis of Tsetlin Machines: A Probabilistic Approach to Concept Learning [20.519261060300302]
This paper presents a comprehensive convergence analysis of Tsetlin automaton-based Machine Learning algorithms.
We introduce a novel framework, referred to as Probabilistic Concept Learning (PCL), which simplifies the TM structure.
We establish a theoretical proof confirming that, for any clause $C_k$, PCL converges to a conjunction of literals when $0.5 < p_k < 1$.
arXiv Detail & Related papers (2023-10-03T12:21:41Z) - On the Equivalence of the Weighted Tsetlin Machine and the Perceptron [12.48513712803069]
The Tsetlin Machine (TM) has been gaining popularity as an inherently interpretable machine learning method.
Despite possessing favorable properties, the TM has not been the go-to method for AI applications.
arXiv Detail & Related papers (2022-12-27T22:38:59Z) - Linear Temporal Logic Modulo Theories over Finite Traces (Extended Version) [72.38188258853155]
This paper studies Linear Temporal Logic over Finite Traces (LTLf) in which
proposition letters are replaced with first-order formulas interpreted over arbitrary theories.
The resulting logic, called LTLf Modulo Theories (LTLfMT), is semi-decidable.
arXiv Detail & Related papers (2022-04-28T17:57:33Z) - On the Convergence of Tsetlin Machines for the AND and the OR Operators [18.937937844147644]
The Tsetlin Machine (TM) is a novel machine-learning algorithm based on propositional logic.
In this article, we analyze the convergence when input training samples follow AND and OR operators respectively.
arXiv Detail & Related papers (2021-09-17T13:53:26Z) - Quantum Algorithms for Data Representation and Analysis [68.754953879193]
We provide quantum procedures that speed-up the solution of eigenproblems for data representation in machine learning.
The power and practical use of these subroutines is shown through new quantum algorithms, sublinear in the input matrix's size, for principal component analysis, correspondence analysis, and latent semantic analysis.
Results show that the run-time parameters that do not depend on the input's size are reasonable and that the error on the computed model is small, allowing for competitive classification performances.
arXiv Detail & Related papers (2021-04-19T00:41:43Z) - Stochastic Approximation for Online Tensorial Independent Component Analysis [98.34292831923335]
Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing.
In this paper, we present a by-product online tensorial algorithm that estimates for each independent component.
arXiv Detail & Related papers (2020-12-28T18:52:37Z) - On the Convergence of Tsetlin Machines for the IDENTITY- and NOT Operators [14.186377732379045]
We analyze the convergence of the Tsetlin Machine with only one clause involved for classification.
Our analysis reveals that the TM, with just one clause, can converge correctly to the intended logical operator.
The analysis of the convergence of the two basic operators lays the foundation for analyzing other logical operators.
arXiv Detail & Related papers (2020-07-28T14:31:16Z) - Closed-Form Expressions for Global and Local Interpretation of Tsetlin Machines with Applications to Explaining High-Dimensional Data [7.05622249909585]
We propose closed-form expressions for understanding why a TM model makes a specific prediction (local interpretability)
We also introduce expressions for measuring the importance of feature value ranges for continuous features.
For both classification and regression, our evaluation shows correspondence with SHAP as well as competitive prediction accuracy in comparison with XGBoost, Explainable Boosting Machines, and Neural Additive Models.
arXiv Detail & Related papers (2020-07-27T21:47:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.