Deterministic Iteratively Built KD-Tree with KNN Search for Exact
Applications
- URL: http://arxiv.org/abs/2106.03799v1
- Date: Mon, 7 Jun 2021 17:09:22 GMT
- Title: Deterministic Iteratively Built KD-Tree with KNN Search for Exact
Applications
- Authors: Aryan Naim, Joseph Bowkett, Sisir Karumanchi, Peyman Tavallali, Brett
Kennedy
- Abstract summary: K-Nearest Neighbors (KNN) search is a fundamental algorithm in artificial intelligence software with applications in robotics and autonomous vehicles.
Similar to binary trees, kd-trees become unbalanced as new data is added in online applications which can lead to rapid degradation in search performance unless the tree is rebuilt.
We will present a "forest of interval kd-trees" which reduces the number of tree rebuilds, without compromising the exactness of query results.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: K-Nearest Neighbors (KNN) search is a fundamental algorithm in artificial
intelligence software with applications in robotics and autonomous vehicles.
These wide-ranging applications either use KNN directly for simple
classification or feed KNN results as input to other algorithms such as
Locally Weighted Learning (LWL). Similar to binary trees, kd-trees become
unbalanced as new data is added in online applications which can lead to rapid
degradation in search performance unless the tree is rebuilt. Although
approximate methods are suitable for graphics applications, which prioritize
query speed over query accuracy, they are unsuitable for certain applications
in autonomous systems, aeronautics, and robotic manipulation where exact
solutions are desired. In this paper, we will attempt to assess the performance
of non-recursive deterministic kd-tree functions and KNN functions. We will
also present a "forest of interval kd-trees" which reduces the number of tree
rebuilds, without compromising the exactness of query results.
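The approach described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation, and all names here are hypothetical: an iteratively built kd-tree whose insertion and exact k-nearest-neighbour search both use loops and an explicit stack in place of recursion.

```python
import heapq

class Node:
    """One kd-tree node splitting on a single coordinate axis."""
    __slots__ = ("point", "axis", "left", "right")

    def __init__(self, point, axis):
        self.point = point
        self.axis = axis
        self.left = None
        self.right = None

def insert(root, point, dims):
    """Iterative (loop-based) insertion; no rebalancing, so a long run of
    online inserts can unbalance the tree, as the abstract notes."""
    if root is None:
        return Node(point, 0)
    node = root
    while True:
        axis = node.axis
        side = "left" if point[axis] < node.point[axis] else "right"
        child = getattr(node, side)
        if child is None:
            setattr(node, side, Node(point, (axis + 1) % dims))
            return root
        node = child

def knn(root, query, k):
    """Exact k-NN search with an explicit stack instead of recursion.
    A max-heap keeps the k best squared distances seen so far; a far
    subtree is pruned only when the splitting plane provably cannot hold
    a closer point, so the result stays exact."""
    best = []  # heap entries: (-squared_distance, point)
    stack = [root]
    while stack:
        node = stack.pop()
        if node is None:
            continue
        d2 = sum((a - b) ** 2 for a, b in zip(node.point, query))
        if len(best) < k:
            heapq.heappush(best, (-d2, node.point))
        elif d2 < -best[0][0]:
            heapq.heapreplace(best, (-d2, node.point))
        diff = query[node.axis] - node.point[node.axis]
        near, far = ((node.left, node.right) if diff < 0
                     else (node.right, node.left))
        if len(best) < k or diff * diff < -best[0][0]:
            stack.append(far)  # plane may still hide a closer point
        stack.append(near)     # popped first: explore the near side first
    return [p for _, p in sorted(best, key=lambda t: -t[0])]
```

Because `insert` never rebalances, repeated online inserts skew the tree and degrade query time; the "forest of interval kd-trees" proposed in the paper mitigates exactly this, spreading inserts across several trees so that full rebuilds are needed less often.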
Related papers
- LiteSearch: Efficacious Tree Search for LLM [70.29796112457662]
This study introduces a novel guided tree search algorithm with dynamic node selection and node-level exploration budget.
Experiments conducted on the GSM8K and TabMWP datasets demonstrate that our approach enjoys significantly lower computational costs compared to baseline methods.
arXiv Detail & Related papers (2024-06-29T05:14:04Z) - Archtree: on-the-fly tree-structured exploration for latency-aware
pruning of deep neural networks [20.564198591600647]
Archtree is a novel method for latency-driven structured pruning of deep neural networks (DNNs)
It involves on-the-fly latency estimation on the target hardware, yielding latency estimates closer to the specified budget.
Empirical results show that Archtree better preserves the original model accuracy while better fitting the latency budget as compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2023-11-17T14:24:12Z) - Autonomous Tree-search Ability of Large Language Models [58.68735916408101]
Large Language Models have demonstrated remarkable reasoning capabilities with advanced prompting techniques.
Recent works propose to utilize external programs to define search logic, such that LLMs can perform passive tree search to solve more challenging reasoning tasks.
We propose a new concept called autonomous tree-search ability of LLM, which can automatically generate a response containing search trajectories for the correct answer.
arXiv Detail & Related papers (2023-10-14T14:14:38Z) - Flexible Channel Dimensions for Differentiable Architecture Search [50.33956216274694]
We propose a novel differentiable neural architecture search method with an efficient dynamic channel allocation algorithm.
We show that the proposed framework is able to find DNN architectures that are equivalent to previous methods in task accuracy and inference latency.
arXiv Detail & Related papers (2023-06-13T15:21:38Z) - RELS-DQN: A Robust and Efficient Local Search Framework for
Combinatorial Optimization [11.269582666887324]
We introduce RELS-DQN, a lightweight DQN framework that exhibits the local search behavior while providing practical scalability.
A RELS-DQN model trained on one application can generalize to various applications, providing solution values higher than or equal to those of both the local search algorithms and the existing DQN models.
arXiv Detail & Related papers (2023-04-11T18:01:49Z) - New Linear-time Algorithm for SubTree Kernel Computation based on
Root-Weighted Tree Automata [0.0]
We propose a new linear time algorithm based on the concept of weighted tree automata for SubTree kernel computation.
The key idea behind the proposed algorithm is to replace the DAG reduction and node sorting steps.
Our approach has three major advantages: it is output-sensitive, it is insensitive to tree type (ordered versus unordered trees), and it is well adapted to any incremental tree-kernel-based learning method.
arXiv Detail & Related papers (2023-02-02T13:37:48Z) - Towards Optimally Efficient Tree Search with Deep Learning [76.64632985696237]
This paper investigates the classical integer least-squares problem, which estimates integer signals from linear models.
The problem is NP-hard and often arises in diverse applications such as signal processing, bioinformatics, communications and machine learning.
We propose a general hyper-accelerated tree search (HATS) algorithm by employing a deep neural network to estimate the optimal heuristic for the underlying simplified memory-bounded A* algorithm.
arXiv Detail & Related papers (2021-01-07T08:00:02Z) - ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse
Coding [86.40042104698792]
We formulate neural architecture search as a sparse coding problem.
In experiments, our two-stage method on CIFAR-10 requires only 0.05 GPU-day for search.
Our one-stage method produces state-of-the-art performances on both CIFAR-10 and ImageNet at the cost of only evaluation time.
arXiv Detail & Related papers (2020-10-13T04:34:24Z) - MurTree: Optimal Classification Trees via Dynamic Programming and Search [61.817059565926336]
We present a novel algorithm for learning optimal classification trees based on dynamic programming and search.
Our approach uses only a fraction of the time required by the state-of-the-art and can handle datasets with tens of thousands of instances.
arXiv Detail & Related papers (2020-07-24T17:06:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.