Hypergraph Optimization for Multi-structural Geometric Model Fitting
- URL: http://arxiv.org/abs/2002.05350v1
- Date: Thu, 13 Feb 2020 05:07:11 GMT
- Title: Hypergraph Optimization for Multi-structural Geometric Model Fitting
- Authors: Shuyuan Lin, Guobao Xiao, Yan Yan, David Suter, Hanzi Wang
- Abstract summary: We propose a novel hypergraph optimization based model fitting (HOMF) method to construct a simple but effective hypergraph.
The proposed method is highly efficient, and it can obtain accurate model fitting results within a few iterations.
- Score: 42.217640009137554
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, some hypergraph-based methods have been proposed to deal with the
problem of model fitting in computer vision, mainly due to the superior
capability of hypergraphs to represent the complex relationships between data
points. However, a hypergraph becomes extremely complicated when the input data
include a large number of data points (usually contaminated with noise and
outliers), which will significantly increase the computational burden. In order
to overcome the above problem, we propose a novel hypergraph optimization based
model fitting (HOMF) method to construct a simple but effective hypergraph.
Specifically, HOMF includes two main parts: an adaptive inlier estimation
algorithm for vertex optimization and an iterative hyperedge optimization
algorithm for hyperedge optimization. The proposed method is highly efficient,
and it can obtain accurate model fitting results within a few iterations.
Moreover, HOMF can then directly apply spectral clustering to achieve good
fitting performance. Extensive experimental results show that HOMF outperforms
several state-of-the-art model fitting methods on both synthetic data and real
images, especially in sampling efficiency and in handling data with severe
outliers.
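To make the described pipeline concrete, below is a minimal, illustrative sketch of the general hypergraph-plus-spectral-clustering workflow outlined in the abstract, using 2-D line fitting as the geometric model. This is not the authors' HOMF algorithm: the hypothesis sampling, inlier threshold, affinity construction, and all names are assumptions made only for illustration.
```python
# Minimal sketch: vertices = data points, hyperedges = inlier sets of sampled
# model hypotheses; spectral clustering on the induced affinity recovers structures.
# NOT the authors' HOMF method; thresholds and names are illustrative assumptions.
import numpy as np
from sklearn.cluster import SpectralClustering

def sample_line_hypotheses(points, n_hypotheses=200, seed=None):
    """Sample minimal subsets (2 points) and fit line hypotheses a*x + b*y + c = 0."""
    rng = np.random.default_rng(seed)
    hypotheses = []
    for _ in range(n_hypotheses):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        a, b = y2 - y1, x1 - x2
        norm = np.hypot(a, b)
        if norm < 1e-9:
            continue  # degenerate sample
        c = -(a * x1 + b * y1)
        hypotheses.append((a / norm, b / norm, c / norm))
    return np.array(hypotheses)

def incidence_matrix(points, hypotheses, inlier_threshold=0.05):
    """Binary incidence between points (vertices) and hypotheses (hyperedges)."""
    residuals = np.abs(points @ hypotheses[:, :2].T + hypotheses[:, 2])
    return (residuals < inlier_threshold).astype(float)

def fit_structures(points, n_structures=2, seed=0):
    H = incidence_matrix(points, sample_line_hypotheses(points, seed=seed))
    affinity = H @ H.T  # points sharing many hyperedges likely lie on the same structure
    np.fill_diagonal(affinity, 0.0)
    return SpectralClustering(n_clusters=n_structures, affinity="precomputed",
                              random_state=seed).fit_predict(affinity)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(-1, 1, size=100)
    line1 = np.c_[t, 0.5 * t + 0.02 * rng.normal(size=100)]        # structure 1
    line2 = np.c_[t, -0.8 * t + 1.0 + 0.02 * rng.normal(size=100)] # structure 2
    print(fit_structures(np.vstack([line1, line2]), n_structures=2))
```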
Related papers
- Functional Graphical Models: Structure Enables Offline Data-Driven Optimization [111.28605744661638]
We show how structure can enable sample-efficient data-driven optimization.
We also present a data-driven optimization algorithm that infers the FGM structure itself.
arXiv Detail & Related papers (2024-01-08T22:33:14Z)
- Learning the hub graphical Lasso model with the structured sparsity via an efficient algorithm [1.0923877073891446]
We introduce a two-phase algorithm to estimate hub graphical models.
The proposed algorithm first generates a good initial point via a dual alternating direction method of multipliers.
It then warm-starts a semismooth Newton (SSN) based augmented Lagrangian method (ALM) to compute a solution that is accurate enough for practical tasks.
arXiv Detail & Related papers (2023-08-17T08:24:28Z)
- Improved Distribution Matching for Dataset Condensation [91.55972945798531]
We propose a novel dataset condensation method based on distribution matching.
Our simple yet effective method outperforms most previous optimization-oriented methods while using far fewer computational resources.
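As a rough illustration of the distribution-matching idea in this summary (not the paper's specific improvements), the sketch below moves a small synthetic set so that its mean embedding under randomly initialized encoders matches that of the real data; all names and hyperparameters are assumptions.
```python
# Generic distribution-matching sketch for dataset condensation (assumed variant).
import torch

def distribution_matching_loss(real_x, syn_x, embed_dim=128, n_encoders=5):
    """Match mean embeddings of real and synthetic samples under fresh random encoders."""
    loss = 0.0
    for _ in range(n_encoders):
        encoder = torch.nn.Linear(real_x.shape[1], embed_dim)  # random, untrained encoder
        with torch.no_grad():
            real_mean = encoder(real_x).mean(dim=0)
        syn_mean = encoder(syn_x).mean(dim=0)
        loss = loss + ((real_mean - syn_mean) ** 2).sum()
    return loss

# usage: optimize the synthetic set directly
real_x = torch.randn(1024, 32)                   # stand-in for real features
syn_x = torch.randn(16, 32, requires_grad=True)  # small learnable synthetic set
opt = torch.optim.Adam([syn_x], lr=0.1)
for step in range(200):
    opt.zero_grad()
    distribution_matching_loss(real_x, syn_x).backward()
    opt.step()
```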
arXiv Detail & Related papers (2023-07-19T04:07:33Z)
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
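A rough reading of the high-order outer-product message passing described above, for a 3-uniform hypergraph stored as an adjacency tensor; this is an assumed simplification for illustration, not the paper's exact THNN layer.
```python
# Assumed simplification of adjacency-tensor message passing with outer products.
import numpy as np

def outer_product_layer(A, X, W):
    """One message-passing step.
    A: (n, n, n) adjacency tensor, A[i, j, k] = 1 if {i, j, k} is a hyperedge.
    X: (n, d) node features.  W: (d*d, d) projection weights.
    """
    n, d = X.shape
    # for each node i, sum outer products of the other two nodes' features
    # over all hyperedges containing i, then project and apply ReLU
    msg = np.einsum('ijk,jp,kq->ipq', A, X, X).reshape(n, d * d)
    return np.maximum(msg @ W, 0.0)

# toy usage
n, d = 5, 4
rng = np.random.default_rng(0)
A = np.zeros((n, n, n))
for (i, j, k) in [(0, 1, 2), (1, 3, 4)]:   # two hyperedges
    for p in {(i, j, k), (i, k, j), (j, i, k), (j, k, i), (k, i, j), (k, j, i)}:
        A[p] = 1.0                          # symmetrize the tensor
X = rng.normal(size=(n, d))
W = rng.normal(size=(d * d, d))
print(outer_product_layer(A, X, W).shape)   # (5, 4)
```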
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Condensing Graphs via One-Step Gradient Matching [50.07587238142548]
We propose a one-step gradient matching scheme, which performs gradient matching for only a single step without training the network weights.
Our theoretical analysis shows this strategy can generate synthetic graphs that lead to lower classification loss on real graphs.
In particular, we are able to reduce the dataset size by 90% while approximating up to 98% of the original performance.
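A hedged sketch of one-step gradient matching in the spirit of this summary, using a plain MLP classifier instead of a GNN for brevity; the model, labels, and hyperparameters are assumptions, not the paper's setup.
```python
# Assumed sketch: match the gradients that real and synthetic data induce at
# freshly initialized weights, without ever training the network itself.
import torch
import torch.nn.functional as F

def one_step_gradient_matching_loss(model, real_x, real_y, syn_x, syn_y):
    params = [p for p in model.parameters() if p.requires_grad]
    g_real = torch.autograd.grad(F.cross_entropy(model(real_x), real_y), params)
    g_syn = torch.autograd.grad(F.cross_entropy(model(syn_x), syn_y), params,
                                create_graph=True)
    return sum(((gr.detach() - gs) ** 2).sum() for gr, gs in zip(g_real, g_syn))

# usage: optimize the small synthetic set against freshly initialized models
real_x, real_y = torch.randn(512, 16), torch.randint(0, 3, (512,))
syn_x = torch.randn(12, 16, requires_grad=True)
syn_y = torch.arange(12) % 3                     # fixed synthetic labels
opt = torch.optim.Adam([syn_x], lr=0.05)
for step in range(100):
    model = torch.nn.Sequential(torch.nn.Linear(16, 32), torch.nn.ReLU(),
                                torch.nn.Linear(32, 3))  # new random init each step
    opt.zero_grad()
    one_step_gradient_matching_loss(model, real_x, real_y, syn_x, syn_y).backward()
    opt.step()
```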
arXiv Detail & Related papers (2022-06-15T18:20:01Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- HyperSF: Spectral Hypergraph Coarsening via Flow-based Local Clustering [9.438207505148947]
We propose an efficient spectral hypergraph coarsening scheme (HyperSF) for preserving the original spectral (structural) properties of hypergraphs.
Our results show that the proposed hypergraph coarsening algorithm can significantly improve the multi-way conductance of hypergraph clustering.
arXiv Detail & Related papers (2021-08-17T22:20:23Z)
- Learnable Hypergraph Laplacian for Hypergraph Learning [34.28748027233654]
HyperGraph Convolutional Neural Networks (HGCNNs) have demonstrated their potential in modeling high-order relations preserved in graph structured data.
We propose the first learning-based method tailored for constructing an adaptive hypergraph structure, termed HypERgrAph Laplacian aDaptor (HERALD).
HERALD adaptively optimizes the adjacency relationship between hypernodes and hyperedges in an end-to-end manner, so that a task-aware hypergraph is learned.
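A loose sketch of a learnable node-hyperedge incidence structure and the induced normalized hypergraph Laplacian, in the spirit of the HERALD summary above; it assumes a standard Zhou-style hypergraph Laplacian and is not the paper's exact formulation.
```python
# Assumed sketch: parameterize a soft incidence matrix and build L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.
import torch

class LearnableHypergraphLaplacian(torch.nn.Module):
    def __init__(self, n_nodes, n_hyperedges):
        super().__init__()
        # logits of a soft incidence matrix between nodes and hyperedges
        self.incidence_logits = torch.nn.Parameter(torch.randn(n_nodes, n_hyperedges))

    def forward(self):
        H = torch.sigmoid(self.incidence_logits)       # soft node-hyperedge incidence
        W = torch.ones(H.shape[1])                      # unit hyperedge weights
        Dv = (H * W).sum(dim=1).clamp_min(1e-6)         # node degrees
        De = H.sum(dim=0).clamp_min(1e-6)               # hyperedge degrees
        Dv_isqrt = Dv.pow(-0.5)
        theta = ((Dv_isqrt[:, None] * H) * (W / De)) @ H.T * Dv_isqrt[None, :]
        return torch.eye(H.shape[0]) - theta            # normalized hypergraph Laplacian

# usage: the Laplacian (and hence the hypergraph) can be trained end-to-end by
# backpropagating a downstream task loss through it
layer = LearnableHypergraphLaplacian(n_nodes=6, n_hyperedges=3)
print(layer().shape)   # torch.Size([6, 6])
```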
arXiv Detail & Related papers (2021-06-12T02:07:07Z)
- Generative hypergraph clustering: from blockmodels to modularity [26.99290024958576]
We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, demonstrated for example by an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
arXiv Detail & Related papers (2021-01-24T00:25:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.