Neighborhood and Graph Constructions using Non-Negative Kernel
Regression
- URL: http://arxiv.org/abs/1910.09383v4
- Date: Sun, 16 Apr 2023 04:57:36 GMT
- Title: Neighborhood and Graph Constructions using Non-Negative Kernel
Regression
- Authors: Sarath Shekkizhar and Antonio Ortega
- Abstract summary: We present an alternative view of neighborhood selection, where we show that neighborhood construction is equivalent to a sparse signal approximation problem.
We also propose an algorithm, non-negative kernel regression (NNK), for obtaining neighborhoods that lead to better sparse representation.
- Score: 42.16401154367232
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data-driven neighborhood definitions and graph constructions are often used
in machine learning and signal processing applications. k-nearest
neighbor (kNN) and $\epsilon$-neighborhood methods are among the most common
methods used for neighborhood selection, due to their computational simplicity.
However, the choice of parameters associated with these methods, such as k and
$\epsilon$, is still ad hoc. We make two main contributions in this paper.
First, we present an alternative view of neighborhood selection, where we show
that neighborhood construction is equivalent to a sparse signal approximation
problem. Second, we propose an algorithm, non-negative kernel regression (NNK),
for obtaining neighborhoods that lead to better sparse representation. NNK
draws similarities to the orthogonal matching pursuit approach to signal
representation and possesses desirable geometric and theoretical properties.
Experiments demonstrate (i) the robustness of the NNK algorithm for
neighborhood and graph construction, (ii) its ability to adapt the number of
neighbors to the data properties, and (iii) its superior performance in local
neighborhood and graph-based machine learning tasks.
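Below is a minimal, illustrative sketch of the NNK idea described in the abstract: start from a kNN candidate set and solve a non-negative kernel regression (a non-negative least-squares problem over kernel similarities) to obtain a sparse, adaptive neighborhood. The Gaussian kernel, the Cholesky-based NNLS reformulation, and all parameter values are assumptions made for illustration, not the authors' implementation; see the paper for the actual algorithm and its analysis.

```python
# Illustrative NNK-style neighborhood sketch (not the authors' implementation).
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.optimize import nnls


def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y (assumed kernel choice)."""
    d2 = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))


def nnk_neighbors(X, i, k=10, sigma=1.0, reg=1e-8, tol=1e-8):
    """Sparse, non-negatively weighted neighborhood of X[i].

    Starts from the k nearest candidates S and solves
        min_{theta >= 0}  theta^T K_S theta - 2 k_{S,i}^T theta,
    i.e. a non-negative kernel regression of x_i onto its candidates.
    """
    dists = np.linalg.norm(X - X[i], axis=1)
    cand = np.argsort(dists)[1 : k + 1]           # skip the point itself
    m = len(cand)

    K_S = gaussian_kernel(X[cand], X[cand], sigma) + reg * np.eye(m)
    k_Si = gaussian_kernel(X[cand], X[i : i + 1], sigma).ravel()

    # K_S = L L^T, so the objective equals ||L^T theta - L^{-1} k_Si||^2 + const.
    L = cholesky(K_S, lower=True)
    b = solve_triangular(L, k_Si, lower=True)
    theta, _ = nnls(L.T, b)

    keep = theta > tol                            # NNK solutions are typically sparse
    return cand[keep], theta[keep]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    idx, w = nnk_neighbors(X, i=0, k=15)
    print(len(idx), idx, np.round(w, 3))          # usually fewer than k neighbors survive
```

Candidates whose regression coefficient is numerically zero are dropped, which is how this construction adapts the number of neighbors to the local data geometry rather than fixing k or $\epsilon$ in advance.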
Related papers
- Adaptive $k$-nearest neighbor classifier based on the local estimation of the shape operator [49.87315310656657]
We introduce a new adaptive $k$-nearest neighbours ($kK$-NN) algorithm that explores the local curvature at a sample to adaptively define the neighborhood size.
Results on many real-world datasets indicate that the new $kK$-NN algorithm yields superior balanced accuracy compared to the established $k$-NN method.
arXiv Detail & Related papers (2024-09-08T13:08:45Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC, which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- LoNe Sampler: Graph node embeddings by coordinated local neighborhood sampling [0.7614628596146599]
Local graph neighborhood sampling is a fundamental computational problem that is at the heart of algorithms for node representation learning.
We present LoNe Sampler, a suite of algorithms for generating discrete node embeddings by Local Neighborhood Sampling.
arXiv Detail & Related papers (2022-11-28T08:04:26Z)
- Beyond kNN: Adaptive, Sparse Neighborhood Graphs via Optimal Transport [0.1933681537640272]
Nearest neighbour graphs are widely used to capture the geometry or topology of a dataset.
One of the most common strategies to construct such a graph is based on selecting a fixed number k of nearest neighbours (kNN) for each point.
We propose a simple approach to construct an adaptive neighbourhood graph from a single parameter, based on quadratically regularised optimal transport.
arXiv Detail & Related papers (2022-08-01T04:24:58Z)
- Active Nearest Neighbor Regression Through Delaunay Refinement [79.93030583257597]
We introduce an algorithm for active function approximation based on nearest neighbor regression.
Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value.
arXiv Detail & Related papers (2022-06-16T10:24:03Z)
- Scalable Optimal Transport in High Dimensions for Graph Distances, Embedding Alignment, and More [7.484063729015126]
We propose two effective log-linear time approximations of the cost matrix for optimal transport.
These approximations enable general log-linear time algorithms for entropy-regularized OT that perform well even for complex, high-dimensional spaces (a generic Sinkhorn sketch is given after this list).
For graph distance regression we propose the graph transport network (GTN), which combines graph neural networks (GNNs) with enhanced Sinkhorn.
arXiv Detail & Related papers (2021-07-14T17:40:08Z)
- Spatial-spectral Hyperspectral Image Classification via Multiple Random Anchor Graphs Ensemble Learning [88.60285937702304]
This paper proposes a novel spatial-spectral HSI classification method via multiple random anchor graphs ensemble learning (RAGE).
Firstly, the local binary pattern is adopted to extract more descriptive features from each selected band, preserving local structures and subtle changes of a region.
Secondly, an adaptive neighbor assignment is introduced in the construction of the anchor graphs to reduce computational complexity.
arXiv Detail & Related papers (2021-03-25T09:31:41Z)
- Canny-VO: Visual Odometry with RGB-D Cameras based on Geometric 3D-2D Edge Alignment [85.32080531133799]
This paper reviews the classical problem of free-form curve registration and applies it to an efficient RGB-D visual odometry system called Canny-VO.
Two replacements for the distance transformation commonly used in edge registration are proposed: Approximate Nearest Neighbour Fields and Oriented Nearest Neighbour Fields.
3D-2D edge alignment benefits from these alternative formulations in terms of both efficiency and accuracy.
arXiv Detail & Related papers (2020-12-15T11:42:17Z)
- Learning Representations using Spectral-Biased Random Walks on Graphs [18.369974607582584]
We study how much a probabilistic bias in the random walk process affects the quality of the nodes it picks.
We succinctly capture this neighborhood as a probability measure based on the spectrum of the node's neighborhood subgraph, represented as a normalized Laplacian matrix (a minimal illustration of this construction is given after this list).
We empirically evaluate our approach against several state-of-the-art node embedding techniques on a wide variety of real-world datasets.
arXiv Detail & Related papers (2020-05-19T20:42:43Z)
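As referenced in the Scalable Optimal Transport entry above, the sketch below shows generic entropy-regularized OT solved with plain (dense, quadratic-time) Sinkhorn iterations. It is only meant to make the term concrete; the cited paper's log-linear cost-matrix approximations and the GTN model are not reproduced here, and all names and parameter values are illustrative.

```python
# Generic entropy-regularized OT via dense Sinkhorn iterations (illustration only;
# this is quadratic in the number of points, not the paper's log-linear scheme).
import numpy as np


def sinkhorn(a, b, C, eps=0.1, n_iters=500):
    """Approximate OT plan between histograms a, b for cost matrix C."""
    K = np.exp(-C / eps)                    # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)                     # scale rows to match marginal a
        v = b / (K.T @ u)                   # scale columns to match marginal b
    return u[:, None] * K * v[None, :]      # plan = diag(u) K diag(v)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, Y = rng.normal(size=(50, 3)), rng.normal(size=(60, 3))
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # squared Euclidean cost
    a, b = np.full(50, 1 / 50), np.full(60, 1 / 60)
    P = sinkhorn(a, b, C, eps=0.1 * C.mean())            # eps scaled to the cost
    print(P.sum(), np.abs(P.sum(1) - a).max())           # total mass ~1, row marginals ~a
```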
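For the spectral-biased random walk entry above, the following is a minimal illustration of summarizing a node's neighborhood by the spectrum of the normalized Laplacian of its neighborhood subgraph. Turning the eigenvalues into a probability vector by simple normalization is an assumption made for illustration; the cited paper's exact measure and walk biasing are not reproduced.

```python
# Illustration only: eigenvalue spectrum of the normalized Laplacian of a node's
# neighborhood subgraph, normalized into a crude probability vector.
import numpy as np


def neighborhood_spectrum(A, v):
    """Normalized-Laplacian eigenvalues of the subgraph induced by the neighbors of v."""
    nbrs = np.flatnonzero(A[v])                    # neighbors of v
    S = A[np.ix_(nbrs, nbrs)].astype(float)        # induced adjacency among neighbors
    deg = S.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    L = np.eye(len(nbrs)) - d_inv_sqrt[:, None] * S * d_inv_sqrt[None, :]
    return np.linalg.eigvalsh(L)                   # eigenvalues lie in [0, 2]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = (rng.random((30, 30)) < 0.2).astype(int)
    A = np.triu(A, 1)
    A = A + A.T                                    # undirected graph, no self-loops
    lam = neighborhood_spectrum(A, v=0)
    p = lam / lam.sum() if lam.sum() > 0 else lam  # assumed probability measure
    print(np.round(lam, 3), np.round(p, 3))
```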
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.