Iterative Nadaraya-Watson Distribution Transfer for Colour Grading
- URL: http://arxiv.org/abs/2006.09208v1
- Date: Mon, 15 Jun 2020 00:14:03 GMT
- Title: Iterative Nadaraya-Watson Distribution Transfer for Colour Grading
- Authors: Hana Alghamdi and Rozenn Dahyot
- Abstract summary: We propose a new method that maps one N-dimensional distribution to another, taking into account available information about correspondences.
We extend the 2D/3D problem to higher dimensions by encoding overlapping neighborhoods of data points and solve the high-dimensional problem in 1D space.
- Score: 2.8790548120668573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new method based on the Nadaraya-Watson estimator that maps one N-dimensional distribution to another, taking into account available information about correspondences. We extend the 2D/3D problem to higher dimensions by encoding overlapping neighborhoods of data points and solve the high-dimensional problem in 1D space using an iterative projection approach. To show the potential of this mapping, we apply it to colour transfer between two images that exhibit overlapping scene content. Experiments show quantitative and qualitative improvements over previous state-of-the-art colour transfer methods.
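As a concrete illustration of the two ingredients named in the abstract, below is a minimal NumPy sketch of a 1D Gaussian-kernel Nadaraya-Watson regression, $\hat m(x)=\sum_i K_h(x-x_i)\,y_i \,/\, \sum_i K_h(x-x_i)$, driven by an iterative random-projection loop. It is only a sketch under assumed choices (Gaussian kernel, random unit directions, a fixed bandwidth h, and the hypothetical helper names nadaraya_watson_1d and iterative_projection_transfer); it is not the authors' implementation, which additionally encodes overlapping neighbourhoods of data points into the feature vectors.

```python
import numpy as np

def nadaraya_watson_1d(x_query, x_corr, y_corr, h=0.05):
    """Gaussian-kernel Nadaraya-Watson regression in one dimension."""
    d = x_query[:, None] - x_corr[None, :]          # pairwise differences
    w = np.exp(-0.5 * (d / h) ** 2)                 # kernel weights
    return (w @ y_corr) / np.maximum(w.sum(axis=1), 1e-12)

def iterative_projection_transfer(src, tgt, n_iter=20, h=0.05, seed=0):
    """Move corresponding samples `src` towards the distribution of `tgt`
    by repeatedly projecting onto a random 1D direction and regressing the
    source projection onto the target projection."""
    rng = np.random.default_rng(seed)
    out = src.astype(float)
    for _ in range(n_iter):
        u = rng.normal(size=src.shape[1])
        u /= np.linalg.norm(u)                      # unit projection direction
        p_src, p_tgt = out @ u, tgt @ u             # 1D projections
        mapped = nadaraya_watson_1d(p_src, p_src, p_tgt, h=h)
        out += (mapped - p_src)[:, None] * u        # update along the direction
    return out

# Example: 500 corresponding RGB samples in [0, 1].
rng = np.random.default_rng(1)
src = rng.uniform(0.0, 0.5, size=(500, 3))
tgt = np.clip(src + 0.3, 0.0, 1.0)                  # a simple colour shift
result = iterative_projection_transfer(src, tgt)
```

Here the rows of src and tgt are assumed to be corresponding samples (e.g. matched pixels from the overlapping scene region), which stands in for the correspondence information mentioned in the abstract.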
Related papers
- NeuSD: Surface Completion with Multi-View Text-to-Image Diffusion [56.98287481620215]
We present a novel method for 3D surface reconstruction from multiple images where only a part of the object of interest is captured.
Our approach builds on two recent developments: surface reconstruction with neural radiance fields for the visible parts of the surface, and guidance from pre-trained 2D diffusion models via Score Distillation Sampling (SDS) to plausibly complete the shape in unobserved regions.
arXiv Detail & Related papers (2023-12-07T19:30:55Z) - SketchSampler: Sketch-based 3D Reconstruction via View-dependent Depth Sampling [75.957103837167]
Reconstructing a 3D shape based on a single sketch image is challenging due to the large domain gap between a sparse, irregular sketch and a regular, dense 3D shape.
Existing works try to employ the global feature extracted from the sketch to directly predict the 3D coordinates, but they usually lose fine details and so are not faithful to the input sketch.
arXiv Detail & Related papers (2022-08-14T16:37:51Z) - CorrI2P: Deep Image-to-Point Cloud Registration via Dense Correspondence [51.91791056908387]
We propose the first feature-based dense correspondence framework for addressing the image-to-point cloud registration problem, dubbed CorrI2P.
Specifically, given a pair of a 2D image and a 3D point cloud, we first transform them into high-dimensional feature space and feed the features into a symmetric overlapping region detector to determine the region where the image and the point cloud overlap.
arXiv Detail & Related papers (2022-07-12T11:49:31Z) - Incorporating Texture Information into Dimensionality Reduction for High-Dimensional Images [65.74185962364211]
We present a method for incorporating neighborhood information into distance-based dimensionality reduction methods.
Based on a classification of different methods for comparing image patches, we explore a number of different approaches.
arXiv Detail & Related papers (2022-02-18T13:17:43Z) - Refer-it-in-RGBD: A Bottom-up Approach for 3D Visual Grounding in RGBD Images [69.5662419067878]
Grounding referring expressions in RGBD images is an emerging field.
We present a novel task of 3D visual grounding in a single-view RGBD image where the referred objects are often only partially scanned due to occlusion.
Our approach first fuses the language and the visual features at the bottom level to generate a heatmap that localizes the relevant regions in the RGBD image.
Then our approach conducts an adaptive feature learning based on the heatmap and performs the object-level matching with another visio-linguistic fusion to finally ground the referred object.
arXiv Detail & Related papers (2021-03-14T11:18:50Z) - Sliced $\mathcal{L}_2$ Distance for Colour Grading [1.6389581549801253]
We propose a new method based on the $\mathcal{L}_2$ distance that maps one $N$-dimensional distribution to another.
We solve the high-dimensional problem in 1D space using an iterative projection approach.
Experiments show results that are quantitatively and qualitatively competitive with state-of-the-art colour transfer methods (a generic sliced-comparison sketch is given after this list).
arXiv Detail & Related papers (2021-02-18T12:17:18Z) - Grassmannian diffusion maps based dimension reduction and classification for high-dimensional data [0.0]
This paper introduces a novel nonlinear dimensionality reduction technique that defines the affinity between points through their representation as low-dimensional subspaces corresponding to points on the Grassmann manifold.
The method is designed for applications such as image recognition and data-based classification of high-dimensional data that can be compactly represented in a lower-dimensional subspace.
arXiv Detail & Related papers (2020-09-16T08:32:02Z) - Geometric Correspondence Fields: Learned Differentiable Rendering for 3D Pose Refinement in the Wild [96.09941542587865]
We present a novel 3D pose refinement approach based on differentiable rendering for objects of arbitrary categories in the wild.
In this way, we precisely align 3D models to objects in RGB images, which results in significantly improved 3D pose estimates.
We evaluate our approach on the challenging Pix3D dataset and achieve up to 55% relative improvement compared to state-of-the-art refinement methods in multiple metrics.
arXiv Detail & Related papers (2020-07-17T12:34:38Z) - Patch based Colour Transfer using SIFT Flow [2.8790548120668573]
We propose a new colour transfer method with Optimal Transport (OT) to transfer the colour of a source image to match the colour of a target image.
Experiments show quantitative and qualitative improvements over previous state-of-the-art colour transfer methods (a generic 1D optimal transport sketch is given after this list).
arXiv Detail & Related papers (2020-05-18T18:22:36Z)
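For the Sliced $\mathcal{L}_2$ entry above, here is a generic sketch of a sliced comparison between two point sets: project both onto random 1D directions, histogram the projections, and average a squared difference over directions. This is a hedged illustration of the general sliced idea, not the cited paper's exact formulation; the function name sliced_l2_discrepancy, the number of directions, and the histogram binning are assumptions.

```python
import numpy as np

def sliced_l2_discrepancy(x, y, n_dirs=50, n_bins=64, seed=0):
    """Average squared-L2 difference between 1D histograms of random
    projections of two point sets x (n, d) and y (m, d)."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_dirs):
        u = rng.normal(size=x.shape[1])
        u /= np.linalg.norm(u)                      # unit projection direction
        px, py = x @ u, y @ u
        lo = min(px.min(), py.min())
        hi = max(px.max(), py.max())
        hx, _ = np.histogram(px, bins=n_bins, range=(lo, hi), density=True)
        hy, _ = np.histogram(py, bins=n_bins, range=(lo, hi), density=True)
        total += np.sum((hx - hy) ** 2)
    return total / n_dirs
```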
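For the patch-based Optimal Transport entry above, the 1D building block of OT-based colour transfer can be sketched as quantile (rank) matching of a single channel, which is the closed-form solution of 1D optimal transport for convex costs. This is again a generic, hedged sketch rather than the paper's patch-based SIFT Flow pipeline; the function name ot_1d_match and the per-channel usage are assumptions.

```python
import numpy as np

def ot_1d_match(src, tgt):
    """Remap 1D samples `src` so their empirical distribution matches `tgt`
    by rank/quantile matching (1D optimal transport, monotone rearrangement)."""
    order = np.argsort(src)
    q_src = np.linspace(0.0, 1.0, len(src))
    q_tgt = np.linspace(0.0, 1.0, len(tgt))
    matched = np.interp(q_src, q_tgt, np.sort(tgt))  # target quantiles at source ranks
    out = np.empty(len(src), dtype=float)
    out[order] = matched
    return out

# Example: match each colour channel of a source image to a target image.
src_img = np.random.default_rng(0).uniform(size=(64, 64, 3))
tgt_img = np.clip(src_img ** 2.2, 0.0, 1.0)
out_img = np.stack(
    [ot_1d_match(src_img[..., c].ravel(), tgt_img[..., c].ravel()).reshape(64, 64)
     for c in range(3)], axis=-1)
```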