Motion Puzzle: Arbitrary Motion Style Transfer by Body Part
- URL: http://arxiv.org/abs/2202.05274v1
- Date: Thu, 10 Feb 2022 19:56:46 GMT
- Title: Motion Puzzle: Arbitrary Motion Style Transfer by Body Part
- Authors: Deok-Kyeong Jang, Soomin Park, Sung-Hee Lee
- Abstract summary: Motion Puzzle is a novel motion style transfer network that advances the state-of-the-art in several important respects.
Our framework extracts style features from multiple style motions for different body parts and transfers them locally to the target body parts.
It can capture styles exhibited by dynamic movements, such as flapping and staggering, significantly better than previous work.
- Score: 6.206196935093063
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents Motion Puzzle, a novel motion style transfer network that advances the state-of-the-art in several important respects. Motion Puzzle is the first network that can control the motion style of individual body parts, allowing for local style editing and significantly increasing the range of stylized motions. Designed to preserve the human kinematic structure, our framework extracts style features from multiple style motions for different body parts and transfers them locally to the target body parts. Another major advantage is that it can transfer both global and local traits of motion style by integrating adaptive instance normalization and attention modules while keeping the skeleton topology. Thus, it can capture styles exhibited by dynamic movements, such as flapping and staggering, significantly better than previous work. In addition, our framework allows for arbitrary motion style transfer without requiring style-labeled or motion-paired datasets, which makes many publicly available motion datasets usable for training. Our framework can be easily integrated with motion generation frameworks to enable many applications, such as real-time motion transfer. We demonstrate the advantages of our framework with a number of examples and comparisons with previous work.
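To make the core mechanism concrete, the following is a minimal PyTorch sketch of per-body-part style injection with adaptive instance normalization (AdaIN). The class names, the channels-by-frames tensor layout, and the joint-to-part channel grouping are illustrative assumptions for this sketch, not the paper's actual skeleton-aware architecture.

```python
import torch
import torch.nn as nn

class AdaIN(nn.Module):
    """Adaptive instance normalization: normalizes content features,
    then re-scales them with statistics predicted from a style code."""
    def __init__(self, style_dim: int, num_channels: int):
        super().__init__()
        self.norm = nn.InstanceNorm1d(num_channels, affine=False)
        self.affine = nn.Linear(style_dim, 2 * num_channels)

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        # content: (batch, channels, frames); style: (batch, style_dim)
        gamma, beta = self.affine(style).chunk(2, dim=1)
        return gamma.unsqueeze(-1) * self.norm(content) + beta.unsqueeze(-1)

class PerPartStyleTransfer(nn.Module):
    """Applies a separate AdaIN per body part, so each part can take
    its style code from a different style motion."""
    def __init__(self, parts: dict, style_dim: int):
        super().__init__()
        self.parts = parts  # hypothetical, e.g. {"left_arm": [3, 4, 5], ...}
        self.adain = nn.ModuleDict(
            {name: AdaIN(style_dim, len(idx)) for name, idx in parts.items()}
        )

    def forward(self, content: torch.Tensor, styles: dict) -> torch.Tensor:
        # styles maps each part name to a (batch, style_dim) style code,
        # possibly extracted from a different style motion per part.
        out = content.clone()
        for name, idx in self.parts.items():
            out[:, idx, :] = self.adain[name](content[:, idx, :], styles[name])
        return out
```

Under this sketch, transferring a style to only one body part amounts to replacing just that part's style code while leaving the other parts at codes extracted from the content motion itself.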
Related papers
- MikuDance: Animating Character Art with Mixed Motion Dynamics [28.189884806755153]
We propose MikuDance, a diffusion-based pipeline incorporating mixed motion dynamics to animate character art.
Specifically, a Scene Motion Tracking strategy is presented to explicitly model the dynamic camera in pixel-wise space, enabling unified character-scene motion modeling.
A Motion-Adaptive Normalization module is incorporated to effectively inject global scene motion, paving the way for comprehensive character art animation.
arXiv Detail & Related papers (2024-11-13T14:46:41Z)
- Decoupling Contact for Fine-Grained Motion Style Transfer [21.61658765014968]
Motion style transfer changes the style of a motion while retaining its content and is useful in computer animations and games.
However, how to decouple and control contact to achieve fine-grained motion style transfer remains unknown.
We present a novel style transfer method for fine-grained control over contacts while achieving both motion naturalness and spatial-temporal variations of style.
arXiv Detail & Related papers (2024-09-09T07:33:14Z)
- Puppet-Master: Scaling Interactive Video Generation as a Motion Prior for Part-Level Dynamics [67.97235923372035]
We present Puppet-Master, an interactive video generative model that can serve as a motion prior for part-level dynamics.
At test time, given a single image and a sparse set of motion trajectories, Puppet-Master can synthesize a video depicting realistic part-level motion faithful to the given drag interactions.
arXiv Detail & Related papers (2024-08-08T17:59:38Z)
- Monkey See, Monkey Do: Harnessing Self-attention in Motion Diffusion for Zero-shot Motion Transfer [55.109778609058154]
Existing diffusion-based motion editing methods overlook the profound potential of the prior embedded within the weights of pre-trained models.
We uncover the roles and interactions of attention elements in capturing and representing motion patterns.
We integrate these elements to transfer a leader motion to a follower one while maintaining the nuanced characteristics of the follower, resulting in zero-shot motion transfer (a toy illustration of this kind of attention injection appears after this list).
arXiv Detail & Related papers (2024-06-10T17:47:14Z)
- MotionCrafter: One-Shot Motion Customization of Diffusion Models [66.44642854791807]
We introduce MotionCrafter, a one-shot instance-guided motion customization method.
MotionCrafter employs a parallel spatial-temporal architecture that injects the reference motion into the temporal component of the base model.
During training, a frozen base model provides appearance normalization, effectively separating appearance from motion.
arXiv Detail & Related papers (2023-12-08T16:31:04Z)
- TapMo: Shape-aware Motion Generation of Skeleton-free Characters [64.83230289993145]
We present TapMo, a Text-driven Animation Pipeline for Motion in a broad spectrum of skeleton-free 3D characters.
TapMo comprises two main components - Mesh Handle Predictor and Shape-aware Diffusion Module.
arXiv Detail & Related papers (2023-10-19T12:14:32Z)
- Motion In-Betweening with Phase Manifolds [29.673541655825332]
This paper introduces a novel data-driven motion in-betweening system to reach target poses of characters by making use of phase variables learned by a Periodic Autoencoder.
Our approach utilizes a mixture-of-experts neural network model, in which the phases cluster movements in both space and time with different expert weights (a minimal sketch of phase-gated expert blending appears after this list).
arXiv Detail & Related papers (2023-08-24T12:56:39Z)
- Human MotionFormer: Transferring Human Motions with Vision Transformers [73.48118882676276]
Human motion transfer aims to transfer the motion of a dynamic target person to a static source person for motion synthesis.
We propose Human MotionFormer, a hierarchical ViT framework that leverages global and local perceptions to capture large and subtle motion matching.
Experiments show that our Human MotionFormer sets the new state-of-the-art performance both qualitatively and quantitatively.
arXiv Detail & Related papers (2023-02-22T11:42:44Z)
- Online Motion Style Transfer for Interactive Character Control [5.6151459129070505]
We propose an end-to-end neural network that can generate motions with different styles and transfer motion styles in real-time under user control.
Our approach eliminates the use of handcrafted phase features and can be easily trained and directly deployed in game systems.
arXiv Detail & Related papers (2022-03-30T15:23:37Z)
- Unpaired Motion Style Transfer from Video to Animation [74.15550388701833]
Transferring the motion style from one animation clip to another, while preserving the motion content of the latter, has been a long-standing problem in character animation.
We present a novel data-driven framework for motion style transfer, which learns from an unpaired collection of motions with style labels.
Our framework is able to extract motion styles directly from videos, bypassing 3D reconstruction, and apply them to the 3D input motion.
arXiv Detail & Related papers (2020-05-12T13:21:27Z)
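Two of the entries above describe mechanisms concrete enough to illustrate in code. First, for the attention-based transfer in "Monkey See, Monkey Do", here is a toy sketch of injecting a leader motion through self-attention: the follower's queries attend over the leader's keys and values. All names and shapes below are illustrative assumptions, not that paper's actual pipeline.

```python
import torch
import torch.nn.functional as F

def inject_leader_attention(
    q_follower: torch.Tensor,  # (frames, dim) queries from the follower motion
    k_leader: torch.Tensor,    # (frames, dim) keys from the leader motion
    v_leader: torch.Tensor,    # (frames, dim) values from the leader motion
) -> torch.Tensor:
    """Attention output that follows the leader's motion pattern while
    being driven (queried) by the follower's own features."""
    scale = q_follower.shape[-1] ** -0.5
    attn = F.softmax(q_follower @ k_leader.T * scale, dim=-1)
    return attn @ v_leader
```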
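Second, for the mixture-of-experts model in "Motion In-Betweening with Phase Manifolds", here is a minimal sketch of blending expert weights with gating coefficients predicted from phase variables. Layer sizes, module names, and the gating network are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class PhaseGatedMoE(nn.Module):
    """A linear layer whose weights are a phase-dependent blend of
    several expert weight matrices."""
    def __init__(self, num_experts: int, phase_dim: int, in_dim: int, out_dim: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(phase_dim, 32), nn.ELU(),
            nn.Linear(32, num_experts), nn.Softmax(dim=-1),
        )
        # One weight matrix and bias vector per expert.
        self.weight = nn.Parameter(torch.randn(num_experts, out_dim, in_dim) * 0.01)
        self.bias = nn.Parameter(torch.zeros(num_experts, out_dim))

    def forward(self, x: torch.Tensor, phase: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_dim); phase: (batch, phase_dim)
        alpha = self.gate(phase)                             # (batch, experts)
        w = torch.einsum("be,eoi->boi", alpha, self.weight)  # blended weights
        b = torch.einsum("be,eo->bo", alpha, self.bias)      # blended biases
        return torch.einsum("boi,bi->bo", w, x) + b
```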
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.