Neural Pose Transfer by Spatially Adaptive Instance Normalization
- URL: http://arxiv.org/abs/2003.07254v2
- Date: Fri, 29 May 2020 17:08:21 GMT
- Title: Neural Pose Transfer by Spatially Adaptive Instance Normalization
- Authors: Jiashun Wang, Chao Wen, Yanwei Fu, Haitao Lin, Tianyun Zou, Xiangyang
Xue, Yinda Zhang
- Abstract summary: We propose the first neural pose transfer model that solves pose transfer via the latest technique for image style transfer.
Our model does not require any correspondences between the source and target meshes.
Experiments show that the proposed model can effectively transfer deformation from source to target meshes, and has good generalization ability to deal with unseen identities or poses of meshes.
- Score: 73.04483812364127
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Pose transfer has been studied for decades, in which the pose of a source
mesh is applied to a target mesh. Particularly in this paper, we are interested
in transferring the pose of a source human mesh to deform a target human mesh,
while the source and target meshes may have different identity information.
Traditional studies assume that paired source and target meshes exist with
point-wise correspondences given by user-annotated landmarks/mesh points, which
requires heavy labelling effort. On the other hand, the generalization ability
of deep models is limited when the source and target meshes have different
identities. To break this limitation, we propose the first neural pose transfer
model that solves pose transfer via the latest technique for
image style transfer, leveraging the newly proposed component -- spatially
adaptive instance normalization. Our model does not require any correspondences
between the source and target meshes. Extensive experiments show that the
proposed model can effectively transfer deformation from source to target
meshes, and has good generalization ability to deal with unseen identities or
poses of meshes. Code is available at
https://github.com/jiashunwang/Neural-Pose-Transfer .
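For readers who want a concrete picture of the spatially adaptive instance normalization the abstract refers to, the following is a minimal PyTorch sketch of such a layer over per-vertex mesh features: instance-normalize the pose features, then modulate them with a per-vertex scale and bias predicted from the target (identity) mesh. The layer widths, conditioning input, and module names are illustrative assumptions, not the authors' released implementation (see the repository above for that).
```python
# Minimal sketch of spatially adaptive instance normalization (SPAdaIN)
# over per-vertex mesh features. Illustrative only; consult the authors'
# repository for the actual architecture.
import torch
import torch.nn as nn


class SPAdaIN(nn.Module):
    """Instance-normalize pose features, then modulate them with a
    per-vertex scale and bias predicted from the identity mesh."""

    def __init__(self, feat_dim: int, cond_dim: int = 3):
        super().__init__()
        # Parameter-free instance norm; the modulation supplies scale/bias.
        self.norm = nn.InstanceNorm1d(feat_dim, affine=False)
        # 1x1 convolutions over the vertex dimension predict gamma and beta
        # from the conditioning input (here, identity-mesh xyz coordinates).
        self.to_gamma = nn.Conv1d(cond_dim, feat_dim, kernel_size=1)
        self.to_beta = nn.Conv1d(cond_dim, feat_dim, kernel_size=1)

    def forward(self, pose_feat: torch.Tensor, identity: torch.Tensor) -> torch.Tensor:
        # pose_feat: (batch, feat_dim, num_vertices) features carrying the pose
        # identity:  (batch, cond_dim, num_vertices) target-mesh vertex coords
        normalized = self.norm(pose_feat)
        return self.to_gamma(identity) * normalized + self.to_beta(identity)


if __name__ == "__main__":
    batch, feat_dim, num_vertices = 2, 64, 6890  # 6890 vertices as in SMPL
    layer = SPAdaIN(feat_dim)
    pose_feat = torch.randn(batch, feat_dim, num_vertices)
    identity = torch.randn(batch, 3, num_vertices)
    print(layer(pose_feat, identity).shape)  # torch.Size([2, 64, 6890])
```
Because the predicted scale and bias vary per vertex, the conditioning mesh can modulate the normalized features locally rather than globally, which is the key difference from plain (adaptive) instance normalization.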
Related papers
- Weakly-supervised 3D Pose Transfer with Keypoints [57.66991032263699]
The main challenges of 3D pose transfer are: 1) lack of paired training data with different characters performing the same pose; 2) disentangling pose and shape information from the target mesh; and 3) difficulty in applying to meshes with different topologies.
We propose a novel weakly-supervised keypoint-based framework to overcome these difficulties.
arXiv Detail & Related papers (2023-07-25T12:40:24Z) - ReorientDiff: Diffusion Model based Reorientation for Object
Manipulation [18.95498618397922]
The ability to manipulate objects into desired configurations is a fundamental requirement for robots to complete various practical applications.
We propose a reorientation planning method, ReorientDiff, that utilizes a diffusion model-based approach.
The proposed method is evaluated using a set of YCB-objects and a suction gripper, demonstrating a success rate of 95.2% in simulation.
arXiv Detail & Related papers (2023-02-28T00:08:38Z) - Generalized One-shot Domain Adaption of Generative Adversarial Networks [72.84435077616135]
The adaption of Generative Adversarial Network (GAN) aims to transfer a pre-trained GAN to a given domain with limited training data.
We consider that the adaptation from source domain to target domain can be decoupled into two parts: the transfer of global style like texture and color, and the emergence of new entities that do not belong to the source domain.
Our core objective is to constrain the gap between the internal distributions of the reference and the syntheses using the sliced Wasserstein distance (a minimal sketch of this distance appears after this list).
arXiv Detail & Related papers (2022-09-08T09:24:44Z) - RAIN: RegulArization on Input and Network for Black-Box Domain
Adaptation [80.03883315743715]
Source-free domain adaptation transfers the source-trained model to the target domain without exposing the source data.
This paradigm is still at risk of data leakage due to adversarial attacks on the source model.
We propose a novel approach named RAIN (RegulArization on Input and Network) for black-box domain adaptation, applying both input-level and network-level regularization.
arXiv Detail & Related papers (2022-08-22T18:18:47Z) - 3D Pose Transfer with Correspondence Learning and Mesh Refinement [41.92922228475176]
3D pose transfer is one of the most challenging 3D generation tasks.
We propose a correspondence-refinement network to help the 3D pose transfer for both human and animal meshes.
arXiv Detail & Related papers (2021-09-30T11:49:03Z) - Transformer-Based Source-Free Domain Adaptation [134.67078085569017]
We study the task of source-free domain adaptation (SFDA), where the source data are not available during target adaptation.
We propose a generic and effective framework based on Transformer, named TransDA, for learning a generalized model for SFDA.
arXiv Detail & Related papers (2021-05-28T23:06:26Z) - Progressive and Aligned Pose Attention Transfer for Person Image
Generation [59.87492938953545]
This paper proposes a new generative adversarial network for pose transfer, i.e., transferring the pose of a given person to a target pose.
We use two types of blocks, namely the Pose-Attentional Transfer Block (PATB) and the Aligned Pose-Attentional Transfer Block (APATB).
We verify the efficacy of the model on the Market-1501 and DeepFashion datasets, using quantitative and qualitative measures.
arXiv Detail & Related papers (2021-03-22T07:24:57Z)
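For the sliced Wasserstein distance mentioned in the one-shot GAN adaptation entry above, here is a minimal PyTorch sketch: project two sets of feature vectors onto random one-dimensional directions, sort each set of projections, and average the squared gaps. The tensor shapes and number of projections are illustrative assumptions, not that paper's exact setup.
```python
# Minimal sketch of the sliced Wasserstein distance between two equal-sized
# sets of feature vectors. Illustrative only.
import torch


def sliced_wasserstein(x: torch.Tensor, y: torch.Tensor, n_proj: int = 128) -> torch.Tensor:
    # x, y: (num_samples, feat_dim) empirical distributions of equal size
    directions = torch.randn(x.shape[1], n_proj, device=x.device)
    directions = directions / directions.norm(dim=0, keepdim=True)
    # Project onto each random direction: (num_samples, n_proj)
    proj_x, _ = torch.sort(x @ directions, dim=0)
    proj_y, _ = torch.sort(y @ directions, dim=0)
    # Squared 1-D Wasserstein-2 between sorted projections, averaged over directions
    return ((proj_x - proj_y) ** 2).mean()


if __name__ == "__main__":
    a = torch.randn(512, 256)
    b = torch.randn(512, 256) + 1.0  # shifted distribution
    print(sliced_wasserstein(a, b).item())
```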
This list is automatically generated from the titles and abstracts of the papers on this site.