Peer Review #2 of "A method for creating interactive, user-resembling avatars (v0.1)"

2017 unpublished
Virtual reality (VR) applications have spread across several fields, with a special quest for immersion. The avatar is one of the key constituents of immersive applications, and avatar resemblance can provoke diverse emotional responses from the user. Yet, many virtual reality systems struggle to implement realistic, life-like avatars. In this work, we propose a novel method for creating interactive, user-resembling avatars using available commercial hardware and software. Avatar visualization is possible with a point cloud or a contiguous polygon surface, and avatar interaction with the virtual scenario happens through a body-joint approximation for contact. In addition, the implementation can easily be extended to other systems, and its modular architecture admits improvements both in visualization and in physical interactions. The code is under the Apache License 2.0 and is freely available as supplemental material.

PeerJ Comput. Sci. reviewing PDF

[...] (Didehbani et al., 2016). Other fields such as architecture and urban planning (Portman et al., 2015; Luigi et al., 2015) and education (Abulrub et al., 2011) also benefit from the technology.

However, as advanced as those technologies are, there are a few limitations to each of them. The technologies mentioned above, such as the HMDs and motion tracking devices, each tackle a single piece of the virtual reality interaction problem.
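The abstract describes avatar contact with the virtual scenario as a body-joint approximation. A minimal sketch of one way such a scheme could work, assuming a spherical contact volume around each tracked joint (all names, positions, and radii here are illustrative assumptions, not taken from the paper's code):

```python
import math

# Hypothetical joint-approximation contact test: each tracked joint is
# wrapped in a sphere, and contact with a scene object occurs when that
# sphere overlaps the object's bounding sphere.

JOINT_RADIUS = 0.08  # metres; assumed per-joint contact radius


def distance(a, b):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def joint_contacts(joints, objects, joint_radius=JOINT_RADIUS):
    """joints: {name: (x, y, z)}; objects: {name: ((x, y, z), radius)}.
    Returns (joint_name, object_name) pairs currently in contact."""
    hits = []
    for jname, jpos in joints.items():
        for oname, (opos, oradius) in objects.items():
            # Spheres touch when centre distance <= sum of radii.
            if distance(jpos, opos) <= joint_radius + oradius:
                hits.append((jname, oname))
    return hits


# Example: the right hand is close enough to a virtual ball to register
# contact, while the head is not.
joints = {"hand_right": (0.40, 1.10, 0.90), "head": (0.0, 1.70, 0.95)}
objects = {"ball": ((0.45, 1.12, 0.92), 0.10)}
print(joint_contacts(joints, objects))  # → [('hand_right', 'ball')]
```

Real engines would typically delegate this to their physics system, but the sphere-per-joint idea is the essence of approximating contact at tracked joints.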
The HMDs give visual support for the simulation, while motion tracking devices provide the means for our body to interact with the virtual world. On one hand, most HMDs deal exclusively with visual perception and head movement, completely ignoring any other body movement. As a result, HMD applications are usually static and display static, generic avatars, frustrating any kind of user interaction other than through vision and head movement. Motion tracking devices, on the other hand, allow for whole-body user interaction but limit the immersion experience because they do not take into account the user's visual field. Therefore, users are limited in how they can interact with the system, depending on which device they use.

A possible solution is to use the capabilities of both devices in an integrated hardware-software [...] Kinect or Asus Xtion. Thus, considering the specifications shown here, we chose the Kinect V2 as the main interface for depth sensing in our system.

The first to appear on the market was the Kinect V1, in 2010, with the official software development kit (SDK) released later, in 2011 (Microsoft, 2017a). In 2014, a second iteration of the Kinect was released (Microsoft, 2017b). The Kinect V2 has a 1080p color camera operating at 30 Hz in good light conditions, a 512x424 depth-sensing camera operating at 30 Hz with a 70x60 degree field of view and a sensing range from 0.5 to 4.5 meters, and active infrared capabilities at the same resolution.

With its pose tracking capabilities and affordability, the Kinect quickly made its way into the research scene. Regarding accuracy for motion tracking, the Kinect is adequate for most scientific purposes, including medical studies on posture and rehabilitation (Clark et al., 2012, 2015; Zhao et al., 2014). In particular, Otte et al. (2016) have shown that the Kinect V2 can derive clinical motion parameters with performance comparable to that of a gold-standard motion capture system.

As for virtual reality headsets, the Oculus Rift was for a period the only option available. Soon enough, however, companies started to unveil similar products, and nowadays the Oculus Rift has a few competitors. Some of them have yet to be commercially launched, such as the Fove and Star VR. Two commercially available platforms are the PlayStation VR and the HTC Vive. Both have hardware specifications very similar to the Oculus Rift's (PlaystationVR, 2017; DigitalTrends, 2017). The main difference is the price range: while the HTC Vive is priced at US$ 799, the PlayStation VR costs US$ 386 and the Oculus Rift is sold for US$ 499. Yet, the Oculus Rift has been in development since 2012 and has created a solid developer community. For this reason, the Oculus Rift is a sensible choice both in terms of hardware specifications and development support.

The first release of the Oculus Rift, named DK1, was for supporters only and occurred in 2012. The second iteration was released in 2014 as the DK2, and the first commercial version was released in 2016. The Oculus Rift DK2 improved on the first iteration with a screen resolution of 960x1080 per eye, a refresh rate of 75 Hz and a persistence of about 2 to 3 ms. In addition, the head-mounted display has sensors to detect head motion both internally and externally.
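Given the depth-camera figures quoted above (512x424 pixels, roughly a 70x60 degree field of view), a point-cloud avatar can be built by back-projecting each depth pixel through a pinhole camera model. The sketch below derives approximate focal lengths from those published specs; a real implementation would instead read the calibrated intrinsics (or use the coordinate-mapping facilities) provided by the Kinect SDK:

```python
import math

# Approximate Kinect V2 depth-camera intrinsics derived from the specs
# in the text: 512x424 pixels, ~70x60 degree field of view. These are
# back-of-the-envelope values, not calibrated parameters.
W, H = 512, 424
FX = (W / 2) / math.tan(math.radians(70 / 2))  # ≈ 365.6 px
FY = (H / 2) / math.tan(math.radians(60 / 2))  # ≈ 367.2 px
CX, CY = W / 2, H / 2                          # principal point at centre


def depth_to_point(u, v, depth_m):
    """Back-project depth pixel (u, v) with depth in metres into a 3-D
    camera-space point (x, y, z) using the pinhole model."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)


# The centre pixel projects straight down the optical axis:
print(depth_to_point(256, 212, 2.0))  # → (0.0, 0.0, 2.0)
```

Applying this to every valid pixel of a 512x424 depth frame yields up to ~217k points per frame at 30 Hz, which is the raw material for the point-cloud avatar visualization mentioned in the abstract.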
doi:10.7287/peerj-cs.128v0.1/reviews/2