The Boundaries of Tractability in Hierarchical Task Network Planning
- URL: http://arxiv.org/abs/2401.14174v1
- Date: Thu, 25 Jan 2024 13:34:33 GMT
- Title: The Boundaries of Tractability in Hierarchical Task Network Planning
- Authors: Cornelius Brand, Robert Ganian, Fionn Mc Inerney, Simon Wietheger
- Abstract summary: We study the complexity-theoretic boundaries of tractability for three classical problems in the context of Hierarchical Task Network Planning.
We show that all three problems can be solved in polynomial time on primitive task networks of constant partial order width.
- Score: 15.926731632774768
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the complexity-theoretic boundaries of tractability for three
classical problems in the context of Hierarchical Task Network Planning: the
validation of a provided plan, whether an executable plan exists, and whether a
given state can be reached by some plan. We show that all three problems can be
solved in polynomial time on primitive task networks of constant partial order
width (and a generalization thereof), whereas for the latter two problems this
holds only under a provably necessary restriction to the state space. Next, we
obtain an algorithmic meta-theorem along with corresponding lower bounds to
identify tight conditions under which general polynomial-time solvability
results can be lifted from primitive to general task networks. Finally, we
enrich our investigation by analyzing the parameterized complexity of the three
considered problems, and show that (1) fixed-parameter tractability for all
three problems can be achieved by replacing the partial order width with the
vertex cover number of the network as the parameter, and (2) other classical
graph-theoretic parameters of the network (including treewidth, treedepth, and
the aforementioned partial order width) do not yield fixed-parameter
tractability for any of the three problems.
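The plan-validation problem from the abstract can be illustrated with a minimal STRIPS-style executability check on a totally ordered primitive task network (a special case of constant partial order width). This is only an illustrative sketch with made-up action names, not the paper's algorithm:

```python
# Minimal sketch of plan validation for a *totally ordered* primitive task
# network. All domain names are illustrative; the paper's algorithm handles
# general networks of constant partial order width.

def validate_plan(plan, initial_state, actions):
    """Check that executing `plan` from `initial_state` is possible.

    `actions` maps an action name to (preconditions, add_effects, del_effects),
    each a set of propositions; `initial_state` is a set of propositions.
    """
    state = set(initial_state)
    for name in plan:
        pre, add, delete = actions[name]
        if not pre <= state:            # a precondition does not hold
            return False
        state = (state - delete) | add  # apply the action's effects
    return True

# Toy domain: move a package from A to B with a truck.
actions = {
    "load":   ({"pkg_at_A", "truck_at_A"},     {"pkg_in_truck"}, {"pkg_at_A"}),
    "drive":  ({"truck_at_A"},                 {"truck_at_B"},   {"truck_at_A"}),
    "unload": ({"pkg_in_truck", "truck_at_B"}, {"pkg_at_B"},     {"pkg_in_truck"}),
}
init = {"pkg_at_A", "truck_at_A"}

print(validate_plan(["load", "drive", "unload"], init, actions))  # True
print(validate_plan(["drive", "load", "unload"], init, actions))  # False
```

Each action is checked and applied once, so validating a length-n plan over a fixed proposition set runs in polynomial time, matching the tractable regime described above.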
Related papers
- The Complexity of Optimizing Atomic Congestion [14.845310803203724]
Atomic congestion games are a classic topic in network design, routing, and algorithmic game theory.
We show that the problem remains highly intractable even on extremely simple networks.
We conclude by extending our analysis towards the (even more challenging) min-max variant of the problem.
arXiv Detail & Related papers (2023-12-15T21:31:30Z)
- A Structural Complexity Analysis of Synchronous Dynamical Systems [33.788917274180484]
We study the three most notable problems in synchronous dynamical systems.
All three problems were known to be intractable in the classical sense.
We show that all three problems remain intractable even on instances of constant treewidth.
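The kind of system studied here can be illustrated with a toy synchronous update rule (majority dynamics) and a brute-force convergence check. The rule, graph, and function names below are illustrative examples, not the paper's constructions:

```python
# Illustrative synchronous dynamical system: every vertex simultaneously
# updates to the strict majority value among itself and its neighbours
# (ties resolve to 0). A toy instance, not the paper's method.

def step(state, adj):
    new = {}
    for v, nbrs in adj.items():
        votes = [state[v]] + [state[u] for u in nbrs]
        new[v] = 1 if 2 * sum(votes) > len(votes) else 0
    return new

def converges(state, adj, max_steps=100):
    """Brute-force check for a fixed point within `max_steps` updates."""
    for _ in range(max_steps):
        nxt = step(state, adj)
        if nxt == state:
            return True, state
        state = nxt
    return False, state

# A path on four vertices with a single dissenting vertex.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
init = {0: 1, 1: 1, 2: 0, 3: 1}
ok, fixed = converges(init, adj)
print(ok, fixed)
```

Deciding such convergence questions in general is exactly where the intractability results above apply, even when the underlying graph is structurally simple (e.g. of constant treewidth).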
arXiv Detail & Related papers (2023-12-12T10:49:33Z)
- What Planning Problems Can A Relational Neural Network Solve? [91.53684831950612]
We present a circuit complexity analysis for relational neural networks representing policies for planning problems.
We show that there are three general classes of planning problems, in terms of the growth of circuit width and depth.
We also illustrate the utility of this analysis for designing neural networks for policy learning.
arXiv Detail & Related papers (2023-12-06T18:47:28Z)
- Data Topology-Dependent Upper Bounds of Neural Network Widths [52.58441144171022]
We first show that a three-layer neural network can be designed to approximate an indicator function over a compact set.
This is then extended to a simplicial complex, deriving width upper bounds based on its topological structure.
We prove the universal approximation property of three-layer ReLU networks using our topological approach.
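As a one-dimensional warm-up for the indicator-approximation idea, four ReLU units suffice to approximate the indicator of an interval by a trapezoid that sharpens as the slope grows. This sketch is only illustrative; the paper's three-layer construction handles compact sets in higher dimensions:

```python
# ReLU trapezoid approximating the indicator of [a, b]: ~1 inside,
# ~0 outside, with transition width 1/k. Illustrative sketch only; the
# paper's three-layer construction covers compact sets via simplicial
# complexes in higher dimensions.

def relu(t):
    return max(0.0, t)

def soft_indicator(x, a, b, k):
    """Difference of two clipped ramps built from four ReLU units."""
    rise = relu(k * (x - a)) - relu(k * (x - a) - 1.0)  # ramps 0 -> 1 at a
    fall = relu(k * (x - b)) - relu(k * (x - b) - 1.0)  # ramps 0 -> 1 at b
    return rise - fall

print(soft_indicator(0.5, 0.0, 1.0, k=100.0))   # 1.0 (inside the interval)
print(soft_indicator(-1.0, 0.0, 1.0, k=100.0))  # 0.0 (left of the interval)
print(soft_indicator(2.0, 0.0, 1.0, k=100.0))   # 0.0 (right of the interval)
```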
arXiv Detail & Related papers (2023-05-25T14:17:15Z)
- The Sample Complexity of One-Hidden-Layer Neural Networks [57.6421258363243]
We study a class of scalar-valued one-hidden-layer networks, and inputs bounded in Euclidean norm.
We prove that controlling the spectral norm of the hidden layer weight matrix is insufficient to get uniform convergence guarantees.
We analyze two important settings where a mere spectral norm control turns out to be sufficient.
arXiv Detail & Related papers (2022-02-13T07:12:02Z)
- Path Regularization: A Convexity and Sparsity Inducing Regularization for Parallel ReLU Networks [75.33431791218302]
We study the training problem of deep neural networks and introduce an analytic approach to unveil hidden convexity in the optimization landscape.
We consider a deep parallel ReLU network architecture, which also includes standard deep networks and ResNets as its special cases.
arXiv Detail & Related papers (2021-10-18T18:00:36Z)
- On the Expected Complexity of Maxout Networks [0.0]
Recent works have shown that the practical complexity of deep ReLU networks is often far from the theoretical maximum.
In this work we show that this phenomenon also occurs in networks with maxout (multi-argument) activation functions.
We also show that the parameter space has a multitude of full-dimensional regions with widely different complexity, and obtain nontrivial lower bounds on the expected complexity.
arXiv Detail & Related papers (2021-07-01T11:36:32Z)
- Global Convergence of Three-layer Neural Networks in the Mean Field Regime [3.553493344868413]
In the mean field regime, neural networks are appropriately scaled so that as the width tends to infinity, the learning dynamics tends to a nonlinear and nontrivial dynamical limit, known as the mean field limit.
Recent works have successfully applied such analysis to two-layer networks and provided global convergence guarantees.
We prove a global convergence result for unregularized feedforward three-layer networks in the mean field regime.
arXiv Detail & Related papers (2021-05-11T17:45:42Z)
- Bounding the Complexity of Formally Verifying Neural Networks: A Geometric Approach [1.9493449206135296]
We consider the computational complexity of formally verifying ReLU Neural Networks (NNs).
In this paper, we show that for two different NNs, the verification problem satisfies two different types of constraints.
Both algorithms efficiently translate the NN parameters into the effect of the NN's architecture by means of hyperplanes.
arXiv Detail & Related papers (2020-12-22T00:29:54Z)
- An Integer Linear Programming Framework for Mining Constraints from Data [81.60135973848125]
We present a general framework for mining constraints from data.
In particular, we consider the inference in structured output prediction as an integer linear programming (ILP) problem.
We show that our approach can learn to solve 9x9 Sudoku puzzles and minimal spanning tree problems from examples without providing the underlying rules.
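The "inference as an ILP" idea can be sketched as choosing binary indicator variables that maximize a linear score subject to a structural constraint. The toy below brute-forces the 0/1 assignments of a tiny instance instead of calling an ILP solver; the constraint and all names are illustrative, not the framework's API:

```python
from itertools import product

# Toy version of structured inference as an ILP: maximize a linear score
# over binary variables y subject to a structural constraint. The instance
# is tiny, so we enumerate 0/1 assignments rather than invoke a solver.

def ilp_inference(scores, constraint):
    """Return the feasible 0/1 assignment with the highest linear score."""
    best, best_val = None, float("-inf")
    for y in product([0, 1], repeat=len(scores)):
        if not constraint(y):
            continue                      # skip infeasible assignments
        val = sum(s * yi for s, yi in zip(scores, y))
        if val > best_val:
            best, best_val = y, val
    return best

# Structural constraint: exactly one label is selected.
exactly_one = lambda y: sum(y) == 1
print(ilp_inference([0.2, 1.5, -0.3], exactly_one))  # (0, 1, 0)
```

Mining constraints, as described above, amounts to learning which such feasibility predicates are consistent with the observed examples.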
arXiv Detail & Related papers (2020-06-18T20:09:53Z)
- Convex Geometry and Duality of Over-parameterized Neural Networks [70.15611146583068]
We develop a convex analytic approach to analyze finite width two-layer ReLU networks.
We show that an optimal solution to the regularized training problem can be characterized as extreme points of a convex set.
In higher dimensions, we show that the training problem can be cast as a finite dimensional convex problem with infinitely many constraints.
arXiv Detail & Related papers (2020-02-25T23:05:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.