LoopNet: A Multitasking Few-Shot Learning Approach for Loop Closure in Large Scale SLAM
- URL: http://arxiv.org/abs/2507.15109v1
- Date: Sun, 20 Jul 2025 20:11:37 GMT
- Title: LoopNet: A Multitasking Few-Shot Learning Approach for Loop Closure in Large Scale SLAM
- Authors: Mohammad-Maher Nakshbandi, Ziad Sharawy, Sorin Grigorescu
- Abstract summary: We tackle the two main problems of real-time SLAM systems: 1) loop closure detection accuracy and 2) real-time constraints on the embedded hardware. Our LoopNet method is based on a multitasking variant of the classical ResNet architecture, adapted for online retraining on a dynamic visual dataset and optimized for embedded devices.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One of the main challenges in the Simultaneous Localization and Mapping (SLAM) loop closure problem is the recognition of previously visited places. In this work, we tackle the two main problems of real-time SLAM systems: 1) loop closure detection accuracy and 2) real-time computation constraints on the embedded hardware. Our LoopNet method is based on a multitasking variant of the classical ResNet architecture, adapted for online retraining on a dynamic visual dataset and optimized for embedded devices. The online retraining is designed using a few-shot learning approach. The architecture provides both an index into the queried visual dataset, and a measurement of the prediction quality. Moreover, by leveraging DISK (DIStinctive Keypoints) descriptors, LoopNet surpasses the limitations of handcrafted features and traditional deep learning methods, offering better performance under varying conditions. Code is available at https://github.com/RovisLab/LoopNet. Additionally, we introduce a new loop closure benchmarking dataset, coined LoopDB, which is available at https://github.com/RovisLab/LoopDB.
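The abstract describes a two-task output: an index into the queried visual dataset plus a prediction-quality measurement. As a rough illustration only (not the authors' implementation; the layer sizes, head names, and sigmoid quality score below are assumptions), such a multitasking head can be sketched as a shared feature vector feeding two linear heads:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MultiTaskHead:
    """Toy stand-in for a LoopNet-style two-task output: an index into
    the queried visual dataset plus a prediction-quality score.
    Sizes and weights are illustrative, not taken from the paper."""
    def __init__(self, feat_dim=128, num_places=50):
        self.W_idx = rng.normal(0, 0.1, (feat_dim, num_places))  # index head
        self.W_q = rng.normal(0, 0.1, (feat_dim, 1))             # quality head

    def forward(self, features):
        probs = softmax(features @ self.W_idx)                   # distribution over dataset entries
        quality = 1.0 / (1.0 + np.exp(-(features @ self.W_q)))   # sigmoid score in (0, 1)
        return probs.argmax(axis=-1), probs.max(axis=-1), quality.squeeze(-1)

head = MultiTaskHead()
feats = rng.normal(size=(4, 128))      # e.g. features derived from DISK descriptors
idx, conf, quality = head.forward(feats)
```

In a real system the shared features would come from the retrained ResNet backbone, and both heads would be trained jointly in the few-shot loop.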
Related papers
- PROL : Rehearsal Free Continual Learning in Streaming Data via Prompt Online Learning [17.230781041043823]
We propose a novel prompt-based method for online continual learning (OCL) that includes four main components. Our proposed method achieves significantly higher performance than the current SOTAs on the CIFAR100, ImageNet-R, ImageNet-A, and CUB datasets.
arXiv Detail & Related papers (2025-07-16T15:04:46Z)
- Visual Loop Closure Detection Through Deep Graph Consensus [7.744347341643204]
We introduce LoopGNN, a graph neural network architecture that estimates loop closure consensus by leveraging the geometry of visually similar frames retrieved through place recognition. Our method yields high-precision estimates while maintaining high recall.
arXiv Detail & Related papers (2025-05-27T20:42:47Z)
- Provably Efficient Online RLHF with One-Pass Reward Modeling [59.30310692855397]
We propose a one-pass reward modeling method that does not require storing historical data and can be computed in constant time. We provide theoretical guarantees showing that our method improves both statistical and computational efficiency. We conduct experiments using the Llama-3-8B-Instruct and Qwen2.5-7B-Instruct models on the Ultrafeedback-binarized and Mixture2 datasets.
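The constant-time, no-stored-history property can be illustrated with a classic device that this setting resembles: online ridge regression updated per sample via the Sherman-Morrison rank-1 identity. This is a generic sketch under that simplifying assumption, not the paper's reward-model update:

```python
import numpy as np

class OnlineRidge:
    """One-pass linear model: each update costs O(d^2) time and memory
    regardless of how many samples have been seen, because the inverse
    Gram matrix is maintained with the Sherman-Morrison identity."""
    def __init__(self, dim, lam=1.0):
        self.A_inv = np.eye(dim) / lam   # (X^T X + lam * I)^{-1}
        self.b = np.zeros(dim)           # X^T y, accumulated online

    def update(self, x, y):
        Ax = self.A_inv @ x
        # (A + x x^T)^{-1} = A^{-1} - A^{-1} x x^T A^{-1} / (1 + x^T A^{-1} x)
        self.A_inv -= np.outer(Ax, Ax) / (1.0 + x @ Ax)
        self.b += y * x

    @property
    def weights(self):
        return self.A_inv @ self.b

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0, 0.5])
model = OnlineRidge(dim=3, lam=1e-3)
for _ in range(2000):
    x = rng.normal(size=3)
    model.update(x, x @ true_w)   # stream samples; none are retained
```

After the stream, `model.weights` recovers the generating weights without ever revisiting past data, which is the efficiency pattern the paper's one-pass setting targets.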
arXiv Detail & Related papers (2025-02-11T02:36:01Z)
- Learning-based Sketches for Frequency Estimation in Data Streams without Ground Truth [8.643366221221351]
We propose a more practical learning-based estimation framework named UCL-sketch, featuring online training via equivalent learning without ground truth and a highly scalable architecture with logical estimation buckets. Results demonstrate that our method greatly outperforms existing state-of-the-art sketches in per-key accuracy and distribution estimation.
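For context, the classical (non-learned) count-min sketch that learning-based stream estimators typically build on fits in a few lines; the hash construction and table sizes below are illustrative choices, not from the paper:

```python
import hashlib

class CountMinSketch:
    """Classical count-min sketch: approximates per-key frequencies in a
    stream with a depth x width counter table. Estimates never
    undercount; collisions can only inflate them."""
    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _hash(self, key, row):
        # One independent-looking hash per row, via a seeded digest.
        h = hashlib.blake2b(f"{row}:{key}".encode(), digest_size=8)
        return int.from_bytes(h.digest(), "big") % self.width

    def add(self, key, count=1):
        for row in range(self.depth):
            self.table[row][self._hash(key, row)] += count

    def estimate(self, key):
        # The minimum across rows is the least-inflated counter.
        return min(self.table[row][self._hash(key, row)]
                   for row in range(self.depth))

cms = CountMinSketch()
for _ in range(100):
    cms.add("heavy")
cms.add("light")
```

A learned variant replaces or augments these fixed buckets with model-driven estimation, which is the direction UCL-sketch pursues.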
arXiv Detail & Related papers (2024-12-04T14:00:50Z)
- Appearance-Based Loop Closure Detection for Online Large-Scale and Long-Term Operation [1.1279808969568252]
In appearance-based localization and mapping, loop closure detection is the process used to determine whether the current observation comes from a previously visited location or a new one.
This paper presents an online loop closure detection approach for large-scale and long-term operation.
arXiv Detail & Related papers (2024-07-22T00:13:00Z)
- Computationally Budgeted Continual Learning: What Does Matter? [128.0827987414154]
Continual Learning (CL) aims to sequentially train models on streams of incoming data that vary in distribution by preserving previous knowledge while adapting to new data.
Current CL literature focuses on restricted access to previously seen data, while imposing no constraints on the computational budget for training.
We revisit this problem with a large-scale benchmark and analyze the performance of traditional CL approaches in a compute-constrained setting.
arXiv Detail & Related papers (2023-03-20T14:50:27Z)
- Lightweight Salient Object Detection in Optical Remote Sensing Images via Feature Correlation [93.80710126516405]
We propose a novel lightweight ORSI-SOD solution, named CorrNet, to address these issues.
By reducing the parameters and computations of each component, CorrNet ends up having only 4.09M parameters and running with 21.09G FLOPs.
Experimental results on two public datasets demonstrate that our lightweight CorrNet achieves competitive or even better performance compared with 26 state-of-the-art methods.
arXiv Detail & Related papers (2022-01-20T08:28:01Z)
- Dynamic Network-Assisted D2D-Aided Coded Distributed Learning [59.29409589861241]
We propose a novel device-to-device (D2D)-aided coded federated learning method (D2D-CFL) for load balancing across devices.
We derive an optimal compression rate for achieving minimum processing time and establish its connection with the convergence time.
Our proposed method is beneficial for real-time collaborative applications, where the users continuously generate training data.
arXiv Detail & Related papers (2021-11-26T18:44:59Z)
- JUMBO: Scalable Multi-task Bayesian Optimization using Offline Data [86.8949732640035]
We propose JUMBO, an MBO algorithm that sidesteps the limitations of prior approaches by querying additional offline data.
We show that it achieves no-regret under conditions analogous to GP-UCB.
Empirically, we demonstrate significant performance improvements over existing approaches on two real-world optimization problems.
arXiv Detail & Related papers (2021-06-02T05:03:38Z)
- LCDNet: Deep Loop Closure Detection for LiDAR SLAM based on Unbalanced Optimal Transport [8.21384946488751]
We introduce the novel LCDNet that effectively detects loop closures in LiDAR point clouds.
LCDNet is composed of a shared encoder, a place recognition head that extracts global descriptors, and a relative pose head that estimates the transformation between two point clouds.
Our approach outperforms state-of-the-art techniques by a large margin even while dealing with reverse loops.
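LCDNet's relative pose head is learned, but the underlying geometric task, estimating the rigid transform aligning two point clouds, has a closed form once correspondences are known (the Kabsch algorithm). A sketch under that simplifying assumption, purely for illustration:

```python
import numpy as np

def kabsch(P, Q):
    """Rigid transform (R, t) minimizing ||R @ P_i + t - Q_i||^2 over
    corresponding 3D points P, Q of shape (N, 3). Classical closed-form
    baseline; LCDNet instead learns correspondences and pose jointly."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

rng = np.random.default_rng(2)
P = rng.normal(size=(30, 3))                 # synthetic source point cloud
theta = np.pi / 3                            # ground-truth 60-degree yaw
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -1.0, 2.0])
Q = P @ R_true.T + t_true                    # transformed target cloud
R_est, t_est = kabsch(P, Q)
```

The hard part in practice, and what the learned approach addresses, is obtaining reliable correspondences between the two LiDAR scans, including in reverse-direction loops.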
arXiv Detail & Related papers (2021-03-08T20:19:37Z)
- 2nd Place Scheme on Action Recognition Track of ECCV 2020 VIPriors Challenges: An Efficient Optical Flow Stream Guided Framework [57.847010327319964]
We propose a data-efficient framework that can train the model from scratch on small datasets.
Specifically, by introducing a 3D central difference convolution operation, we propose a novel C3D neural network-based two-stream framework.
We show that our method achieves promising results even without a model pre-trained on large-scale datasets.
arXiv Detail & Related papers (2020-08-10T09:50:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.