Emotion Expression of Avatar through Eye Behaviors, Lip Synchronization and MPEG4 in Virtual Reality based on Xface Toolkit: Present and Future

Ahmad Hoirul Basori, Itimad Raheem Ali
2013 Procedia - Social and Behavioral Sciences  
Eye movement, lip synchronization, and emotional facial expression together form an interesting research field that provides information about the verbal and nonverbal behaviors occurring in the human body. Most previous researchers focused on eye gaze, lip synching, and emotion expression, the most important features for transferring nonverbal information to enhance, understand, or express emotion. In this paper, the recent advances in 3D facial expression are introduced, focusing on the Xface platform toolkit, which synthesizes 3D talking avatars by implementing a text-to-speech (TTS) engine to depict the basic lip shape necessary for each phoneme to convey the dialogue. This work is believed to give future directions that can lead to new research issues in facial animation.
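The TTS-driven lip-sync pipeline described above can be sketched as a phoneme-to-viseme lookup. The sketch below is illustrative only, under the assumption of an ARPAbet-style phoneme inventory; the phoneme set, viseme names, and `visemes_for` helper are hypothetical and not taken from the Xface toolkit's actual API.

```python
# Minimal sketch: map phonemes (as a TTS engine might emit them)
# to basic mouth shapes (visemes) for lip synchronization.
# Phoneme and viseme names here are illustrative assumptions.

PHONEME_TO_VISEME = {
    "AA": "open",       # as in "father"
    "IY": "wide",       # as in "see"
    "UW": "rounded",    # as in "boot"
    "M":  "closed",     # bilabial closure
    "B":  "closed",
    "P":  "closed",
    "F":  "lip_teeth",  # labiodental
    "V":  "lip_teeth",
}

def visemes_for(phonemes):
    """Map a phoneme sequence to viseme keys, falling back to a
    neutral mouth shape for phonemes not in the table."""
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

if __name__ == "__main__":
    # "map" -> M AE P; AE is not in the table, so it falls back to neutral
    print(visemes_for(["M", "AE", "P"]))  # ['closed', 'neutral', 'closed']
```

In a full system each viseme key would drive the corresponding MPEG-4 facial animation parameters over the phoneme's duration; the lookup table above only shows the phoneme-to-shape step.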
doi:10.1016/j.sbspro.2013.10.290