PhysPose: Refining 6D Object Poses with Physical Constraints
- URL: http://arxiv.org/abs/2503.23587v1
- Date: Sun, 30 Mar 2025 20:52:17 GMT
- Title: PhysPose: Refining 6D Object Poses with Physical Constraints
- Authors: Martin Malenický, Martin Cífka, Médéric Fourmy, Louis Montaut, Justin Carpentier, Josef Sivic, Vladimir Petrik
- Abstract summary: We introduce PhysPose, a novel approach that integrates physical reasoning into pose estimation. By leveraging scene geometry, PhysPose refines pose estimates to ensure physical plausibility. We demonstrate its impact in robotics by significantly improving success rates in a challenging pick-and-place task.
- Score: 26.47867548297063
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate 6D object pose estimation from images is a key problem in object-centric scene understanding, enabling applications in robotics, augmented reality, and scene reconstruction. Despite recent advances, existing methods often produce physically inconsistent pose estimates, hindering their deployment in real-world scenarios. We introduce PhysPose, a novel approach that integrates physical reasoning into pose estimation through a postprocessing optimization enforcing non-penetration and gravitational constraints. By leveraging scene geometry, PhysPose refines pose estimates to ensure physical plausibility. Our approach achieves state-of-the-art accuracy on the YCB-Video dataset from the BOP benchmark and improves over the state-of-the-art pose estimation methods on the HOPE-Video dataset. Furthermore, we demonstrate its impact in robotics by significantly improving success rates in a challenging pick-and-place task, highlighting the importance of physical consistency in real-world applications.
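The abstract does not spell out the optimization, but a minimal sketch of a physics-based post-processing step in this spirit might look as follows, assuming objects are crudely approximated by bounding spheres resting on a ground plane at z = 0 (the actual method uses full scene geometry and mesh-level collision constraints; `refine_poses`, the sphere proxies, and the penalty weights are illustrative assumptions, not the paper's implementation):

```python
# Hypothetical sketch of physics-based pose post-processing in the spirit of
# PhysPose: refine initial pose estimates so objects neither interpenetrate
# nor float. Objects are approximated by bounding spheres and the scene by a
# ground plane at z = 0; this is a toy stand-in for real scene geometry.
import numpy as np
from scipy.optimize import minimize

def refine_poses(init_centers, radii, w_prior=1.0, w_phys=10.0):
    """init_centers: (N, 3) estimated object centers; radii: (N,) sphere proxies."""
    n = len(init_centers)

    def cost(x):
        centers = x.reshape(n, 3)
        # Prior: stay close to the image-based initial estimates.
        prior = np.sum((centers - init_centers) ** 2)
        # Non-penetration: no sinking below the ground plane (z = 0) and
        # no overlap between objects.
        pen = np.sum(np.maximum(0.0, radii - centers[:, 2]) ** 2)
        for i in range(n):
            for j in range(i + 1, n):
                gap = np.linalg.norm(centers[i] - centers[j])
                pen += max(0.0, radii[i] + radii[j] - gap) ** 2
        # Gravity: penalize hovering above the resting height.
        grav = np.sum(np.maximum(0.0, centers[:, 2] - radii) ** 2)
        return w_prior * prior + w_phys * (pen + grav)

    res = minimize(cost, np.asarray(init_centers, float).ravel(), method="L-BFGS-B")
    return res.x.reshape(n, 3)
```

The prior term keeps the result near the image-based estimate while the penalty terms trade off against it, pulling floating objects down toward resting contact and pushing interpenetrating ones apart.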
Related papers
- DynOPETs: A Versatile Benchmark for Dynamic Object Pose Estimation and Tracking in Moving Camera Scenarios [20.835782699441797]
This paper presents DynOPETs, a novel dataset for object pose estimation and tracking in unconstrained environments.
Our efficient annotation method innovatively integrates pose estimation and pose tracking techniques to generate pseudo-labels.
The resulting dataset offers accurate pose annotations for dynamic objects observed from moving cameras.
arXiv Detail & Related papers (2025-03-25T13:13:44Z) - Targeted Hard Sample Synthesis Based on Estimated Pose and Occlusion Error for Improved Object Pose Estimation [9.637714330461037]
We propose a novel method of hard example synthesis that is model-agnostic. We demonstrate an improvement in correct detection rate of up to 20% across several ROBI-dataset objects using state-of-the-art pose estimation models.
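As a rough illustration of the idea (not the paper's pipeline), targeted hard-sample selection can be reduced to ranking candidate poses by the current model's error on synthetic renders; `render`, `estimate_pose`, and `pose_error` below are hypothetical callables standing in for a renderer, the model under training, and a pose metric:

```python
# Hypothetical sketch of targeted hard-sample selection: oversample the object
# poses on which the current pose estimator performs worst, to be rendered as
# additional training data.
import numpy as np

def select_hard_poses(candidate_poses, render, estimate_pose, pose_error, n_keep=64):
    """Rank candidate object poses by the current estimator's error on
    synthetic renders and keep the worst as targeted training samples."""
    errors = [pose_error(p, estimate_pose(render(p))) for p in candidate_poses]
    worst_first = np.argsort(errors)[::-1]       # largest error first
    return [candidate_poses[i] for i in worst_first[:n_keep]]
```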
arXiv Detail & Related papers (2024-12-05T16:00:55Z) - Physically Compatible 3D Object Modeling from a Single Image [109.98124149566927]
We present a framework that transforms single images into 3D physical objects. Our framework embeds physical compatibility into the reconstruction process. It consistently enhances the physical realism of 3D models over existing methods.
arXiv Detail & Related papers (2024-05-30T21:59:29Z) - DeepSimHO: Stable Pose Estimation for Hand-Object Interaction via Physics Simulation [81.11585774044848]
We present DeepSimHO, a novel deep-learning pipeline that combines forward physics simulation and backward gradient approximation with a neural network.
Our method noticeably improves the stability of the estimation and achieves superior efficiency over test-time optimization.
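A hedged sketch of this forward-simulation/backward-approximation pattern, using a custom PyTorch autograd function: the forward pass queries a non-differentiable simulator, while the backward pass takes gradients from a learned surrogate. `simulate` and `surrogate` are assumed callables, not DeepSimHO's actual interfaces:

```python
import torch

class StabilityLoss(torch.autograd.Function):
    """Forward: score pose stability with a (non-differentiable) physics
    simulator. Backward: take gradients from a neural surrogate trained to
    mimic the simulator, so the loss can still drive the pose network."""

    @staticmethod
    def forward(ctx, pose, simulate, surrogate):
        ctx.save_for_backward(pose)
        ctx.surrogate = surrogate
        # E.g. object displacement after a short rollout under gravity;
        # 0 means the estimated pose is perfectly stable.
        return pose.new_tensor(simulate(pose.detach()))

    @staticmethod
    def backward(ctx, grad_out):
        (pose,) = ctx.saved_tensors
        # Backpropagate through the learned surrogate instead of the simulator.
        with torch.enable_grad():
            p = pose.detach().requires_grad_(True)
            ctx.surrogate(p).backward()
        return grad_out * p.grad, None, None
```

A pose network could then be trained with `loss = StabilityLoss.apply(pose, simulate, surrogate)` alongside its usual supervision.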
arXiv Detail & Related papers (2023-10-11T05:34:36Z) - RNNPose: Recurrent 6-DoF Object Pose Refinement with Robust Correspondence Field Estimation and Pose Optimization [46.144194562841435]
We propose a framework based on a recurrent neural network (RNN) for object pose refinement.
The problem is formulated as a non-linear least squares problem based on the estimated correspondence field.
The correspondence field estimation and pose refinement are performed alternately in each iteration to recover accurate object poses.
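A minimal sketch of such a least-squares refinement step, assuming the correspondence field has already been reduced to 3D model points paired with observed 2D image locations (the recurrent estimation of the field itself is omitted):

```python
# Non-linear least squares on the reprojection residual, with rotation
# parameterized as an axis-angle vector. A stand-in for the optimization
# step the summary describes, not the paper's implementation.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_pose(pts_3d, pts_2d, K, rvec0, t0):
    """pts_3d: (N, 3) model points; pts_2d: (N, 2) matched image points;
    K: (3, 3) camera intrinsics; rvec0, t0: initial axis-angle rotation
    and translation."""
    def residual(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        cam = pts_3d @ R.T + x[3:]              # model points in camera frame
        uvw = cam @ K.T
        proj = uvw[:, :2] / uvw[:, 2:3]         # perspective division
        return (proj - pts_2d).ravel()          # reprojection residual

    x = least_squares(residual, np.concatenate([rvec0, t0])).x
    return x[:3], x[3:]                         # refined rotation, translation
```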
arXiv Detail & Related papers (2022-03-24T06:24:55Z) - A Bayesian Treatment of Real-to-Sim for Deformable Object Manipulation [59.29922697476789]
We propose a methodology for extracting state information from image sequences by representing the state of a deformable object as a distribution embedding.
Our experiments confirm that we can estimate posterior distributions of physical properties, such as elasticity, friction, and scale, of highly deformable objects such as cloth and ropes.
arXiv Detail & Related papers (2021-12-09T17:50:54Z) - Spatial Attention Improves Iterative 6D Object Pose Estimation [52.365075652976735]
We propose a new method for 6D pose estimation refinement from RGB images.
Our main insight is that after the initial pose estimate, it is important to pay attention to distinct spatial features of the object.
We experimentally show that this approach learns to attend to salient spatial features and learns to ignore occluded parts of the object, leading to better pose estimation across datasets.
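A minimal sketch of a spatial attention gate of this kind, as an assumed module rather than the paper's architecture: a learned per-pixel weight map re-weights the feature map so the refiner can down-weight occluded regions:

```python
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    """Predict a per-pixel weight in [0, 1] and gate the feature map with it."""
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feats):                    # feats: (B, C, H, W)
        attn = torch.sigmoid(self.score(feats))  # (B, 1, H, W)
        return feats * attn                      # broadcast over channels
```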
arXiv Detail & Related papers (2021-01-05T17:18:52Z) - Kinematic-Structure-Preserved Representation for Unsupervised 3D Human Pose Estimation [58.72192168935338]
Generalizability of human pose estimation models developed using supervision on large-scale in-studio datasets remains questionable.
We propose a novel kinematic-structure-preserved unsupervised 3D pose estimation framework that does not rely on any paired or unpaired weak supervision.
Our model employs three consecutive differentiable transformations: forward kinematics, camera projection, and a spatial-map transformation.
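A hedged sketch of the first two transformations, assuming a simple kinematic tree with joints ordered parent-before-child (the spatial-map step, which renders projected joints as heatmaps, is omitted):

```python
import numpy as np

def forward_kinematics(parents, offsets, rotations):
    """parents[i]: parent joint index (-1 for the root); offsets: (J, 3) bone
    vectors in the parent frame; rotations: (J, 3, 3) local joint rotations.
    Joints must be ordered so each parent precedes its children."""
    J = len(parents)
    glob_R = np.zeros((J, 3, 3))
    pos = np.zeros((J, 3))
    for i in range(J):
        if parents[i] < 0:
            glob_R[i] = rotations[i]             # root at the origin
        else:
            glob_R[i] = glob_R[parents[i]] @ rotations[i]
            pos[i] = pos[parents[i]] + glob_R[parents[i]] @ offsets[i]
    return pos

def project(joints_3d, K):
    """Pinhole projection of (J, 3) camera-frame joints with intrinsics K."""
    uvw = joints_3d @ K.T
    return uvw[:, :2] / uvw[:, 2:3]
```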
arXiv Detail & Related papers (2020-06-24T23:56:33Z) - Occlusion resistant learning of intuitive physics from videos [52.25308231683798]
A key ability for artificial systems is to understand physical interactions between objects and to predict future outcomes of a situation.
This ability, often referred to as intuitive physics, has recently received attention, and several methods have been proposed to learn these physical rules from video sequences.
arXiv Detail & Related papers (2020-04-30T19:35:54Z)