Capturing performance-based character animation in real time for Unity

Hannes Wagner
2015, unpublished
Transferring an actor's movements directly onto a virtual 3D model is an effective approach to creating animations. Even better is being able to transfer not only movements but also facial expressions onto the model, which makes the animations appear more realistic and natural to the viewer. Creating such high-quality animations, however, is complex and time-consuming. The goal of this thesis is the development of a functional prototype that efficiently realizes such a performance-capturing system. It should enable the user to transfer complex motion sequences directly onto a 3D character using live motion capturing and, if so desired, to record these animations for later use. The recorded animations are intended to be transferable onto any humanoid 3D model.

The capturing process can be segmented into three basic steps: the whole body, or rather the skeleton, of the user is tracked by a Kinect v2; hand and finger movements are registered via 5DT Data Gloves; and the face is recorded with a webcam, from which feature points are extracted to facilitate the transfer of expressions and emotions onto the face of the model.

The prototype was developed using the Unity game engine and will likely be made available as an Asset Package to enable easy import. For the import to function correctly, Unity version 5 or later is required, because the included plugins are not supported otherwise. The prototype was constructed in such a way that some of its components are interchangeable, so new tracking methods can be added continually. As a result, the prototype can be customized and easily extended and improved. The prototype was performance-tested with Unity's analytical tools and compared to other tracking solutions through a qualitative analysis. The results of these tests are documented in this thesis.
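The interchangeable-component design described above — three independent tracking sources (body, hands, face) merged into one capture frame behind a common interface — can be sketched as follows. The thesis prototype is a Unity/C# system; this is a language-neutral Python sketch, and all class and field names (`TrackingSource`, `PerformanceCapture`, the stub pose values) are hypothetical illustrations, not the thesis's actual API.

```python
from abc import ABC, abstractmethod


class TrackingSource(ABC):
    """One interchangeable capture component (body, hands, or face)."""

    @abstractmethod
    def poll(self) -> dict:
        """Return the latest pose data as a name -> value mapping."""


class KinectBodySource(TrackingSource):
    # Hypothetical stand-in for Kinect v2 skeleton tracking.
    def poll(self) -> dict:
        return {"spine": (0.0, 1.0, 0.0)}


class DataGloveSource(TrackingSource):
    # Hypothetical stand-in for 5DT Data Glove finger-flexion values.
    def poll(self) -> dict:
        return {"index_flex": 0.3}


class WebcamFaceSource(TrackingSource):
    # Hypothetical stand-in for webcam feature-point extraction.
    def poll(self) -> dict:
        return {"mouth_open": 0.1}


class PerformanceCapture:
    """Merges all registered sources into one frame; new tracking
    methods can be added without changing the consumer."""

    def __init__(self) -> None:
        self.sources: list[TrackingSource] = []

    def register(self, source: TrackingSource) -> None:
        self.sources.append(source)

    def capture_frame(self) -> dict:
        # Later sources overwrite earlier keys on collision; in this
        # sketch each source contributes disjoint channels.
        frame: dict = {}
        for source in self.sources:
            frame.update(source.poll())
        return frame
```

Because the consumer only depends on `TrackingSource`, a new tracker (say, a different glove or a depth-camera face tracker) plugs in by implementing `poll()` and calling `register()`, mirroring the extensibility the abstract claims for the prototype.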
doi:10.25365/thesis.38441 fatcat:mxcdeqpssrfsvfkiqlr7htwyku