Instantiations and Computational Aspects of Non-Flat Assumption-based Argumentation
- URL: http://arxiv.org/abs/2404.11431v2
- Date: Fri, 24 May 2024 13:42:44 GMT
- Title: Instantiations and Computational Aspects of Non-Flat Assumption-based Argumentation
- Authors: Tuomo Lehtonen, Anna Rapberger, Francesca Toni, Markus Ulbricht, Johannes P. Wallner
- Abstract summary: We study an instantiation-based approach for reasoning in possibly non-flat ABA.
We propose two algorithmic approaches for reasoning in possibly non-flat ABA.
- Score: 18.32141673219938
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Most existing computational tools for assumption-based argumentation (ABA) focus on so-called flat frameworks, disregarding the more general case. In this paper, we study an instantiation-based approach for reasoning in possibly non-flat ABA. We make use of a semantics-preserving translation between ABA and bipolar argumentation frameworks (BAFs). By utilizing compilability theory, we establish that the constructed BAFs will in general be of exponential size. In order to keep the number of arguments and computational cost low, we present three ways of identifying redundant arguments. Moreover, we identify fragments of ABA which admit a poly-sized instantiation. We propose two algorithmic approaches for reasoning in possibly non-flat ABA. The first approach utilizes the BAF instantiation while the second works directly without constructing arguments. An empirical evaluation shows that the former outperforms the latter on many instances, reflecting the lower complexity of BAF reasoning. This result is in contrast to flat ABA, where direct approaches dominate instantiation-based approaches.
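As a quick illustration of the flat/non-flat distinction discussed in the abstract, here is a minimal sketch based on the standard ABA definitions (not code or notation from the paper): an ABA framework consists of rules, assumptions, and a contrary function, and it is flat exactly when no assumption occurs as the head of a rule. The class name and the toy framework below are hypothetical.
```python
# Minimal sketch (standard ABA definitions, not the paper's implementation):
# an ABA framework as plain Python data, with a check for the flat restriction.
from dataclasses import dataclass

@dataclass
class ABAFramework:
    assumptions: set   # atoms that may be assumed
    contrary: dict     # maps each assumption to its contrary atom
    rules: list        # (head, body) pairs; each body is a set of atoms

    def is_flat(self) -> bool:
        # Flat: no assumption is derivable, i.e. never appears as a rule head.
        return all(head not in self.assumptions for head, _ in self.rules)

# Hypothetical toy framework: assumption "a" is the head of the second rule,
# so the framework is non-flat and outside the scope of most existing tools.
abaf = ABAFramework(
    assumptions={"a", "b"},
    contrary={"a": "p", "b": "q"},
    rules=[("p", {"b"}), ("a", {"b"})],
)
print(abaf.is_flat())  # False
```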
Related papers
- A Methodology for Gradual Semantics for Structured Argumentation under Incomplete Information [15.717458041314194]
We provide a novel methodology for obtaining gradual semantics for structured argumentation frameworks.
Our methodology accommodates incomplete information about arguments' premises.
We demonstrate the potential of our approach by introducing two different instantiations of the methodology.
arXiv Detail & Related papers (2024-10-29T16:38:35Z) - A Comparative Study on Reasoning Patterns of OpenAI's o1 Model [69.08287909042421]
We show that OpenAI's o1 model has achieved the best performance on most datasets.
We also provide a detailed analysis on several reasoning benchmarks.
arXiv Detail & Related papers (2024-10-17T15:09:03Z) - On the Correspondence of Non-flat Assumption-based Argumentation and Logic Programming with Negation as Failure in the Head [20.981256612743145]
We show a correspondence between non-flat ABA and LPs with negation as failure in their head.
We then extend this result to so-called set-stable ABA semantics, originally defined for the fragment of non-flat ABA called bipolar ABA.
We showcase how to define set-stable semantics for LPs with negation as failure in their head and show the correspondence to set-stable ABA semantics.
arXiv Detail & Related papers (2024-05-15T15:10:03Z) - Non-flat ABA is an Instance of Bipolar Argumentation [23.655909692988637]
Assumption-based Argumentation (ABA) is a well-known structured argumentation formalism.
A common restriction imposed on ABA frameworks (ABAFs) is that they are flat.
No translation exists from general, possibly non-flat ABAFs into any kind of abstract argumentation formalism.
arXiv Detail & Related papers (2023-05-21T13:18:08Z) - Argumentative Explanations for Pattern-Based Text Classifiers [15.81939090849456]
We focus on explanations for a specific interpretable model, namely pattern-based logistic regression (PLR) for binary text classification.
We propose AXPLR, a novel explanation method using (forms of) computational argumentation to generate explanations.
arXiv Detail & Related papers (2022-05-22T21:16:49Z) - Harnessing Incremental Answer Set Solving for Reasoning in Assumption-Based Argumentation [1.5469452301122177]
Assumption-based argumentation (ABA) is a central structured argumentation formalism.
Recent advances in answer set programming (ASP) enable efficiently solving NP-hard reasoning tasks of ABA in practice.
arXiv Detail & Related papers (2021-08-09T17:34:05Z) - Scalable Personalised Item Ranking through Parametric Density Estimation [53.44830012414444]
Learning from implicit feedback is challenging because of the difficult nature of the one-class problem.
Most conventional methods use a pairwise ranking approach and negative samplers to cope with the one-class problem.
We propose a learning-to-rank approach, which achieves convergence speed comparable to the pointwise counterpart.
arXiv Detail & Related papers (2021-05-11T03:38:16Z) - Approximation Algorithms for Sparse Principal Component Analysis [57.5357874512594]
Principal component analysis (PCA) is a widely used dimension reduction technique in machine learning and statistics.
Various approaches to obtain sparse principal direction loadings have been proposed, which are termed Sparse Principal Component Analysis.
We present thresholding as a provably accurate, polynomial-time approximation algorithm for the SPCA problem (a minimal sketch of this thresholding baseline appears after this list).
arXiv Detail & Related papers (2020-06-23T04:25:36Z) - A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton [49.23948907229656]
Bi-level Descent Aggregation is a flexible and modularized algorithmic framework for generic bi-level optimization.
We derive a new methodology to prove the convergence of BDA without the LLS condition.
Our investigations also demonstrate that BDA is compatible with a variety of particular first-order computation modules.
arXiv Detail & Related papers (2020-06-07T05:18:50Z) - Lower bounds in multiple testing: A framework based on derandomized proxies [107.69746750639584]
This paper introduces an analysis strategy based on derandomization, illustrated by applications to various concrete models.
We provide numerical simulations of some of these lower bounds, and show a close relation to the actual performance of the Benjamini-Hochberg (BH) algorithm.
arXiv Detail & Related papers (2020-05-07T19:59:51Z) - On the Convergence Rate of Projected Gradient Descent for a Back-Projection based Objective [58.33065918353532]
We consider a back-projection (BP) based fidelity term as an alternative to the common least squares (LS) term.
We show that using the BP term, rather than the LS term, requires fewer iterations of optimization algorithms.
arXiv Detail & Related papers (2020-05-03T00:58:23Z)
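To make the sparse PCA entry above concrete, here is a minimal sketch of the generic thresholding baseline, assuming the usual formulation: take the leading eigenvector of the sample covariance, keep its k largest-magnitude entries, and renormalize. It illustrates the general technique only; the paper's exact algorithm and guarantees may differ. The function name and the random test data are hypothetical.
```python
# Minimal sketch of thresholding for sparse PCA (illustrative only): compute
# the leading eigenvector of the sample covariance, keep its k largest-
# magnitude entries, zero out the rest, and renormalize to obtain a k-sparse
# approximate principal direction.
import numpy as np

def thresholded_spca(X: np.ndarray, k: int) -> np.ndarray:
    cov = np.cov(X, rowvar=False)              # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    v = eigvecs[:, -1]                         # leading principal direction
    keep = np.argsort(np.abs(v))[-k:]          # indices of k largest |entries|
    sparse_v = np.zeros_like(v)
    sparse_v[keep] = v[keep]
    return sparse_v / np.linalg.norm(sparse_v)  # unit-norm k-sparse vector

# Example usage with random data (100 samples, 10 features, sparsity 3).
rng = np.random.default_rng(0)
print(thresholded_spca(rng.standard_normal((100, 10)), k=3))
```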
This list is automatically generated from the titles and abstracts of the papers in this site.