Neural Kinematic Networks for Unsupervised Motion Retargetting

Ruben Villegas, Jimei Yang, Duygu Ceylan, Honglak Lee
2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2018)
Figure 1 (panels: Input Motion, Target Character 1, Target Character 2, shown over time): Our end-to-end method retargets a given input motion (top row) to new characters with different bone lengths and proportions (middle and bottom rows). The target characters are never seen performing the input motion during training.

Abstract

We propose a recurrent neural network architecture with a Forward Kinematics layer and a cycle-consistency-based adversarial training objective for unsupervised motion retargetting. Our network captures the high-level properties of an input motion through the Forward Kinematics layer and adapts them to a target character with different skeleton bone lengths (e.g., shorter or longer arms). Collecting paired motion training sequences from different characters is expensive; instead, our network uses cycle consistency to learn to solve the Inverse Kinematics problem in an unsupervised manner. Our method works online, i.e., it adapts the motion sequence on-the-fly as new frames are received. In our experiments, we use the Mixamo animation data to test our method on a variety of motions and characters and achieve state-of-the-art results. We also demonstrate motion retargetting from monocular human videos to 3D characters using an off-the-shelf 3D pose estimator.
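To make the Forward Kinematics idea concrete, the sketch below shows how per-joint rotations and fixed bone offsets (the quantity that differs between characters) combine into world-space joint positions. This is an illustrative NumPy example of standard forward kinematics, not the authors' implementation; the names `forward_kinematics`, `parents`, `offsets`, and `rotations` are hypothetical.

```python
# Illustrative forward kinematics pass: local joint rotations + bone offsets
# -> world-space joint positions. Bone lengths (the offsets) are where a
# target character's proportions enter.
import numpy as np

def forward_kinematics(rotations, offsets, parents):
    """rotations: (J, 3, 3) local rotation matrix per joint.
    offsets:   (J, 3) offset of each joint relative to its parent.
    parents:   length-J list; parents[j] is the parent index, -1 for root."""
    J = len(parents)
    world_rot = [None] * J
    positions = np.zeros((J, 3))
    for j in range(J):
        if parents[j] == -1:                      # root joint
            world_rot[j] = rotations[j]
            positions[j] = offsets[j]
        else:
            p = parents[j]
            world_rot[j] = world_rot[p] @ rotations[j]
            positions[j] = positions[p] + world_rot[p] @ offsets[j]
    return positions

# Toy 3-joint chain (root -> elbow -> wrist) with identity rotations.
parents = [-1, 0, 1]
offsets = np.array([[0.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],     # upper-arm length 1.0
                    [0.0, 0.8, 0.0]])    # forearm length 0.8
rotations = np.stack([np.eye(3)] * 3)
print(forward_kinematics(rotations, offsets, parents))
```

Because the joint positions are a differentiable function of the rotations, a recurrent network can predict rotations for a target skeleton and be trained through such a layer with losses defined on the resulting positions, which is the role the Forward Kinematics layer plays in the proposed architecture.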
doi:10.1109/cvpr.2018.00901 dblp:conf/cvpr/VillegasYCL18