A. O. Rakova, N. V. Bilous
2020 Radìoelektronika, Ìnformatika, Upravlìnnâ  
Context. The direction of the human face vector is an indicator of human attention. It has many applications in daily life, such as human-computer interaction, teleconferencing, virtual reality, and 3D sound rendering. Moreover, head pose can be used to compare the exercises performed by a person against a reference standard, which motivates the search for ways to track movements efficiently. Depth-camera-based systems, frequently used for these purposes, have significant drawbacks, such as reduced accuracy in direct sunlight and the need for additional equipment. Recognition from a two-dimensional image is becoming more widespread and eliminates the difficulties associated with depth cameras, allowing such systems to be used both indoors and outdoors.

Objective. The purpose of this work is to create a method that tracks human head movements and records only the significant head-direction vectors.

Methods. This paper proposes a reference points method that reduces the set of recorded vectors to the minimal subset needed to describe a head movement. It also investigates and compares existing methods for determining the face vector with respect to their use in the proposed approach.

Results. The proposed reference points method is able to greatly reduce the set of head-direction vectors that describe a movement. According to the results of the study, regression-based methods showed significantly better accuracy and robustness to lighting conditions and partial face occlusion, so they were chosen to supply the head-direction vector in the reference points approach.

Conclusions. The research confirmed the applicability of the reference points method for tracking human movements and showed that methods for determining the head vector from a two-dimensional image can compete in accuracy with RGBD-based methods. Thus, combined with the proposed approach, these methods impose fewer restrictions on use than RGBD-based ones.
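The abstract does not specify the exact criterion the reference points method uses to decide which head-direction vectors are "significant". As a minimal illustrative sketch only, one plausible criterion is angular-change thresholding: keep a vector only when it deviates from the last kept vector by more than a fixed angle. The function names and the threshold value below are assumptions, not the authors' implementation.

```python
import math

def angle_between(u, v):
    """Angle in radians between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def reduce_vectors(vectors, threshold_rad=0.1):
    """Keep only vectors that deviate from the last kept vector
    by more than threshold_rad (hypothetical reduction rule)."""
    if not vectors:
        return []
    kept = [vectors[0]]
    for v in vectors[1:]:
        if angle_between(kept[-1], v) > threshold_rad:
            kept.append(v)
    return kept
```

Under this rule, a sequence of nearly identical vectors collapses to a single entry, while any turn of the head larger than the threshold produces a new recorded vector.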
doi:10.15588/1607-3274-2020-3-11