Learning polytopes with fixed facet directions
- URL: http://arxiv.org/abs/2201.03419v1
- Date: Mon, 10 Jan 2022 16:05:44 GMT
- Title: Learning polytopes with fixed facet directions
- Authors: Maria Dostert and Katharina Jochemko
- Abstract summary: We show that for fixed simplicial normal fan the least-squares estimate is given by a convex quadratic program.
We provide an algorithm that converges to the unknown shape as the number of noisy support function evaluations increases.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the task of reconstructing polytopes with fixed facet directions
from finitely many support function evaluations. We show that for fixed
simplicial normal fan the least-squares estimate is given by a convex quadratic
program. We study the geometry of the solution set and give a combinatorial
characterization for the uniqueness of the reconstruction in this case. We
provide an algorithm that, under mild assumptions, converges to the unknown
input shape as the number of noisy support function evaluations increases. We
also discuss limitations of our results if the restriction on the normal fan is
removed.
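The abstract's reduction can be illustrated on a toy fixed normal fan. For an axis-aligned rectangle in the plane, the support function is linear in the facet offsets on each cone of the fan, so fitting noisy support evaluations by least squares becomes an ordinary (unconstrained) linear least-squares problem, a special case of the convex quadratic program the paper describes. The setup below is our own illustrative sketch, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# True rectangle [-1, 2] x [-0.5, 1]; facet offsets b = (right, top, left, bottom),
# i.e. the polytope is {x : x1 <= b1, x2 <= b2, -x1 <= b3, -x2 <= b4}.
b_true = np.array([2.0, 1.0, 1.0, 0.5])

def support_row(u):
    """On each cone of the rectangle's (fixed) normal fan, the support
    function h(u) = max(u1*b1, -u1*b3) + max(u2*b2, -u2*b4) is linear in b;
    return the coefficient row selecting the active facets for direction u."""
    u1, u2 = u
    row = np.zeros(4)
    if u1 >= 0: row[0] = u1
    else:       row[2] = -u1
    if u2 >= 0: row[1] = u2
    else:       row[3] = -u2
    return row

# Noisy support function evaluations in m random unit directions
m = 200
U = rng.normal(size=(m, 2))
U /= np.linalg.norm(U, axis=1, keepdims=True)
C = np.array([support_row(u) for u in U])
y = C @ b_true + 0.05 * rng.normal(size=m)

# The least-squares estimate: here an unconstrained convex quadratic program
b_hat, *_ = np.linalg.lstsq(C, y, rcond=None)
print(np.round(b_hat, 2))  # close to b_true
```

As the number of evaluations m grows, the estimate concentrates around the true facet offsets, mirroring the convergence statement in the abstract.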
Related papers
- Finite-Sample Inference for Sparsely Permuted Linear Regression [6.2000582635449994]
We develop a general statistical inference framework on the permutation and regression coefficients. For computational purposes, we develop a linear assignment problem formulation and demonstrate that, with high probability, its solution is equivalent to that of conventional least squares, which carries a large computational cost.
arXiv Detail & Related papers (2026-01-21T11:00:47Z) - An Unconditional Representation of the Conditional Score in Infinite-Dimensional Linear Inverse Problems [5.340736751238338]
We propose an unconditional representation of the conditional score-function tailored to linear inverse problems. We show that the conditional score can be derived exactly from a trained (unconditional) score using affine transformations. Our approach is formulated in infinite-dimensional function spaces, making it inherently discretization-invariant.
arXiv Detail & Related papers (2024-05-24T15:33:27Z) - Convergence of Some Convex Message Passing Algorithms to a Fixed Point [0.0]
A popular approach to the MAP inference problem in graphical models is to minimize an upper bound obtained from a dual linear programming or Lagrangian relaxation by (block-coordinate) descent.
This is also known as convex/convergent message passing; examples are max-sum diffusion and sequential tree-reweighted message passing (TRW-S).
We show that the algorithm terminates within $\mathcal{O}(1/\varepsilon)$ iterations.
arXiv Detail & Related papers (2024-03-07T13:14:21Z) - Covering Number of Real Algebraic Varieties and Beyond: Improved Bounds and Applications [8.438718130535296]
We prove upper bounds on the covering number of sets in Euclidean space.
We show that our bounds improve the best known general bound due to Yomdin-Comte.
We illustrate the power of the result on three computational applications.
arXiv Detail & Related papers (2023-11-09T03:06:59Z) - Bridging Discrete and Backpropagation: Straight-Through and Beyond [62.46558842476455]
We propose a novel approach to approximate the gradient of parameters involved in generating discrete latent variables.
We propose ReinMax, which achieves second-order accuracy by integrating Heun's method, a second-order numerical method for solving ODEs.
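The Heun-based ReinMax correction is beyond a short sketch, but the first-order straight-through estimator it improves on can be shown in a few lines: the forward pass uses a hard one-hot decision, while the backward pass routes the gradient through the soft distribution as if no discretization had happened. The loss weights below are illustrative, not from the paper.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

theta = np.array([0.2, 1.0, -0.5])
p = softmax(theta)
hard = np.eye(3)[np.argmax(p)]      # discrete one-hot decision (forward pass)

# Straight-through trick: the forward pass uses `hard`, but the backward
# pass pretends the output was the soft distribution p, so gradients flow.
w = np.array([1.0, -2.0, 0.5])      # toy linear loss L = w . output
grad_out = w                        # dL/d(output), rerouted through p
J = np.diag(p) - np.outer(p, p)     # Jacobian of softmax w.r.t. theta
grad_theta = J @ grad_out           # straight-through gradient estimate
print(hard, np.round(grad_theta, 3))
```

In autodiff frameworks this is usually written as `out = hard + (p - stop_gradient(p))`, which has the same forward value and the same rerouted gradient.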
arXiv Detail & Related papers (2023-04-17T20:59:49Z) - GeoUDF: Surface Reconstruction from 3D Point Clouds via Geometry-guided
Distance Representation [73.77505964222632]
We present a learning-based method, namely GeoUDF, to tackle the problem of reconstructing a discrete surface from a sparse point cloud.
To be specific, we propose a geometry-guided learning method for UDF and its gradient estimation.
To extract triangle meshes from the predicted UDF, we propose a customized edge-based marching cube module.
arXiv Detail & Related papers (2022-11-30T06:02:01Z) - Multi-block-Single-probe Variance Reduced Estimator for Coupled
Compositional Optimization [49.58290066287418]
We propose a novel method named the Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on strong convexity.
arXiv Detail & Related papers (2022-07-18T12:03:26Z) - A block-sparse Tensor Train Format for sample-efficient high-dimensional
Polynomial Regression [0.0]
Low-rank tensors are an established framework for high-dimensional problems.
We propose to extend this framework by including the concept of block-sparsity.
This allows us to adapt the ansatz space to align better with known sample results.
arXiv Detail & Related papers (2021-04-29T10:57:53Z) - Root-finding Approaches for Computing Conformal Prediction Set [18.405645120971496]
Conformal prediction constructs a confidence region for an unobserved response of a feature vector based on previous identically distributed and exchangeable observations.
We exploit the fact that, often, conformal prediction sets are intervals whose boundaries can be efficiently approximated by classical root-finding software.
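The interval-boundary idea can be sketched on a toy split-conformal setup: the prediction set is where a conformal p-value exceeds the level alpha, and its two endpoints are roots of `p_value(y) - alpha` on either side of the point prediction. Everything below (the scores, the prediction `y_hat`, the bisection helper) is our own illustration; in practice one would call a classical root-finder such as `scipy.optimize.brentq`, as the paper suggests.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy calibration residual scores around a fixed point prediction y_hat
scores = np.abs(rng.normal(size=100))
y_hat, alpha = 3.0, 0.1

def p_value(y):
    """Conformal p-value of a candidate response y (larger = more plausible)."""
    s = abs(y - y_hat)
    return (np.sum(scores >= s) + 1) / (len(scores) + 1)

def bisect(f, a, b, tol=1e-8):
    """Plain bisection; a stand-in for classical root-finding software."""
    fa = f(a)
    for _ in range(100):
        m = 0.5 * (a + b)
        if f(m) * fa > 0:
            a, fa = m, f(m)
        else:
            b = m
        if b - a < tol:
            break
    return 0.5 * (a + b)

# The prediction set {y : p_value(y) > alpha} is an interval around y_hat;
# find its endpoints as sign changes of p_value(y) - alpha on either side.
f = lambda y: p_value(y) - alpha
lo = bisect(f, y_hat - 10.0, y_hat)
hi = bisect(f, y_hat, y_hat + 10.0)
print(round(lo, 3), round(hi, 3))
```

Bisection works even though the empirical p-value is a step function, since only a sign change inside the bracket is needed.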
arXiv Detail & Related papers (2021-04-14T06:41:12Z) - Reconstruction of Voxels with Position- and Angle-Dependent Weightings [66.25540976151842]
We first formulate this reconstruction problem in terms of a system matrix and weighting part.
We compute the pseudoinverse and show that the system is rank-deficient and hence the problem is severely ill-posed.
arXiv Detail & Related papers (2020-10-27T11:29:47Z) - Computational Barriers to Estimation from Low-Degree Polynomials [81.67886161671379]
We study the power of low-degree polynomials for the task of detecting the presence of hidden structures.
For a large class of "signal plus noise" problems, we give a user-friendly lower bound for the best possible mean squared error achievable by any low-degree polynomial.
As applications, we give a tight characterization of the low-degree minimum mean squared error for the planted submatrix and planted dense subgraph problems.
arXiv Detail & Related papers (2020-08-05T17:52:10Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Exact MAP-Inference by Confining Combinatorial Search with LP Relaxation [19.660527989370646]
We propose a family of relaxations, which naturally define lower bounds for its optimum.
This family always contains a tight relaxation and we give an algorithm able to find it and therefore, solve the initial non-relaxed NP-hard problem.
The relaxations we consider decompose the original problem into two non-overlapping parts: an easy LP-tight part and a difficult one.
arXiv Detail & Related papers (2020-04-14T09:10:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.