Exact learning and test theory
- URL: http://arxiv.org/abs/2201.04506v1
- Date: Wed, 12 Jan 2022 15:10:22 GMT
- Title: Exact learning and test theory
- Authors: Mikhail Moshkov
- Abstract summary: We study arbitrary infinite binary information systems, each of which consists of an infinite set of elements and an infinite set of two-valued attributes defined on it.
We consider the notion of a problem over an information system, described by a finite number of attributes.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, based on results of exact learning and test theory, we study
arbitrary infinite binary information systems each of which consists of an
infinite set of elements and an infinite set of two-valued functions
(attributes) defined on the set of elements. We consider the notion of a
problem over an information system, which is described by a finite number of
attributes: for a given element, we should recognize the values of these
attributes. As algorithms for problem solving, we consider decision trees of
two types: (i) using only proper hypotheses (an analog of proper equivalence
queries from exact learning), and (ii) using both attributes and proper
hypotheses. As time complexity, we study the depth of decision trees. In the
worst case, as the number of attributes in the problem description grows, the
minimum depth of decision trees of both types is either bounded from above by
a constant, grows logarithmically, or grows linearly. Based on
these results and results obtained earlier for attributes and arbitrary
hypotheses, we divide the set of all infinite binary information systems into
seven complexity classes.
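To make the query model concrete, here is a minimal toy sketch (not taken from the paper): the elements are the integers 0..7, the attributes are their three bits, and a proper hypothesis is a value tuple realized by some element; a rejected hypothesis is answered with the value of one attribute on which it is wrong. All names, the tiny information system, and this counterexample convention are illustrative assumptions; the sketch simply counts queries, which plays the role of depth.

```python
# Toy sketch (illustrative only, not the paper's construction).
# Elements are 0..7; attribute f_i returns the i-th bit of the hidden element.
ELEMENTS = range(8)
ATTRIBUTES = [lambda x, i=i: (x >> i) & 1 for i in range(3)]


def solve_with_attributes(hidden, problem):
    """Query every attribute in the problem description directly."""
    answer, depth = [], 0
    for f in problem:
        depth += 1                      # one attribute query
        answer.append(f(hidden))
    return tuple(answer), depth


def solve_with_proper_hypotheses(hidden, problem):
    """Use only proper hypotheses: guess a value tuple realized by some
    element; a wrong guess is answered with one attribute value that
    contradicts it (a simplified counterexample convention)."""
    proper = {tuple(f(x) for f in problem) for x in ELEMENTS}
    target = tuple(f(hidden) for f in problem)
    depth = 0
    while True:
        guess = min(proper)             # any proper hypothesis would do
        depth += 1                      # one hypothesis query
        if guess == target:
            return guess, depth
        # counterexample: one attribute on which the guess is wrong
        i = next(j for j, b in enumerate(guess) if b != target[j])
        proper = {t for t in proper if t[i] == target[i]}


if __name__ == "__main__":
    problem = ATTRIBUTES                # recognize all three bits of the element
    worst_attr = max(solve_with_attributes(x, problem)[1] for x in ELEMENTS)
    worst_hyp = max(solve_with_proper_hypotheses(x, problem)[1] for x in ELEMENTS)
    print("worst-case depth, attribute queries only:", worst_attr)
    print("worst-case depth, proper hypotheses only:", worst_hyp)
```

Under these assumptions, on such a small system the hypothesis-only strategy may need more queries than asking the attributes directly; the paper's results concern how the worst-case depth scales as the number of attributes in the problem description grows (bounded by a constant, logarithmic, or linear).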
Related papers
- Greedy Algorithm for Inference of Decision Trees from Decision Rule Systems [0.0]
Decision trees and decision rule systems play important roles as classifiers, knowledge representation tools, and algorithms.
In this paper, we consider the inverse transformation problem, which is not so simple.
Instead of constructing an entire decision tree, our study focuses on a greedy polynomial-time algorithm that simulates the operation of a decision tree on a given tuple of attribute values.
arXiv Detail & Related papers (2024-01-08T09:28:55Z)
- The Computational Complexity of Concise Hypersphere Classification [49.57441416941195]
This paper is the first complexity-theoretic study of the hypersphere classification problem for binary data.
We use the fine-grained parameterized complexity paradigm to analyze the impact of structural properties that may be present in the input data.
arXiv Detail & Related papers (2023-12-12T09:33:03Z)
- Probabilistic Tree-of-thought Reasoning for Answering Knowledge-intensive Complex Questions [93.40614719648386]
Large language models (LLMs) are capable of answering knowledge-intensive complex questions with chain-of-thought (CoT) reasoning.
Recent works turn to retrieving external knowledge to augment CoT reasoning.
We propose a novel approach: Probabilistic Tree-of-thought Reasoning (ProbTree).
arXiv Detail & Related papers (2023-11-23T12:52:37Z)
- Reasoning over Hierarchical Question Decomposition Tree for Explainable Question Answering [83.74210749046551]
We propose to leverage question decomposition for heterogeneous knowledge integration.
We introduce a novel two-stage XQA framework, Reasoning over Hierarchical Question Decomposition Tree (RoHT).
Experiments on the complex QA datasets KQA Pro and MuSiQue show that our framework significantly outperforms SOTA methods.
arXiv Detail & Related papers (2023-05-24T11:45:59Z)
- Construction of Decision Trees and Acyclic Decision Graphs from Decision Rule Systems [0.0]
We study the complexity of constructing decision trees and acyclic decision graphs representing decision trees from decision rule systems.
We discuss the possibility of not building the entire decision tree, but describing the computation path in this tree for the given input.
arXiv Detail & Related papers (2023-05-02T18:40:48Z)
- Regularized impurity reduction: Accurate decision trees with complexity guarantees [20.170305081348328]
We propose a tree-induction algorithm that gives a logarithmic approximation guarantee on the tree complexity.
The enhanced algorithms strike an excellent balance between predictive accuracy and tree complexity.
arXiv Detail & Related papers (2022-08-23T13:15:19Z)
- Improved Quantum Query Upper Bounds Based on Classical Decision Trees [0.0]
Given a classical query algorithm as a decision tree, when does there exist a quantum query algorithm with a speed-up over the classical one?
We provide a general construction based on the structure of the underlying decision tree, and prove that this can give us an up-to-quadratic quantum speed-up.
arXiv Detail & Related papers (2022-03-06T14:04:06Z)
- Exact learning for infinite families of concepts [0.0]
We study arbitrary infinite families of concepts, each of which consists of an infinite set of elements and an infinite set of subsets of this set, called concepts.
We consider the notion of a problem over a family of concepts that is described by a finite number of elements.
arXiv Detail & Related papers (2022-01-13T07:32:47Z)
- The Causal Neural Connection: Expressiveness, Learnability, and Inference [125.57815987218756]
An object called a structural causal model (SCM) represents a collection of mechanisms and sources of random variation of the system under investigation.
In this paper, we show that the causal hierarchy theorem (Thm. 1, Bareinboim et al., 2020) still holds for neural models.
We introduce a special type of SCM called a neural causal model (NCM), and formalize a new type of inductive bias to encode structural constraints necessary for performing causal inferences.
arXiv Detail & Related papers (2021-07-02T01:55:18Z)
- Foundations of Reasoning with Uncertainty via Real-valued Logics [70.43924776071616]
We give a sound and strongly complete axiomatization that can be parametrized to cover essentially every real-valued logic.
Our class of sentences is very rich, and each sentence describes a set of possible real values for a collection of formulas of the real-valued logic.
arXiv Detail & Related papers (2020-08-06T02:13:11Z)
- MurTree: Optimal Classification Trees via Dynamic Programming and Search [61.817059565926336]
We present a novel algorithm for learning optimal classification trees based on dynamic programming and search.
Our approach uses only a fraction of the time required by the state-of-the-art and can handle datasets with tens of thousands of instances.
arXiv Detail & Related papers (2020-07-24T17:06:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information it provides and is not responsible for any consequences of its use.