Smartphone-based Acoustic Indoor Space Mapping

Swadhin Pradhan, Ghufran Baig, Wenguang Mao, Lili Qiu, Guohai Chen, Bo Yang
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2018
Constructing a map of indoor space has many important applications, such as indoor navigation, VR/AR, construction, safety, facility management, and network condition prediction. Existing indoor space mapping requires special hardware (e.g., indoor LiDAR equipment) and well-trained operators. In this paper, we develop a smartphone-based indoor space mapping system that lets a regular user quickly map an indoor space by simply walking around while holding a phone in his/her hand. Our system
accurately measures the distance to nearby reflectors, estimates the user's trajectory, and pairs different reflectors the user encounters during the walk to automatically construct the contour. Using extensive evaluation, we show our contour construction is accurate: the median errors are 1.5 cm for a single wall and 6 cm for multiple walls (due to the longer trajectory and the higher number of walls). We show that our system provides a median error of 30 cm and a 90-percentile error of 1 m, which is significantly better than the state-of-the-art smartphone acoustic mapping system BatMapper [64], whose corresponding errors are 60 cm and 2.5 m respectively, even after multiple walks. We further show that the constructed indoor contour can be used to predict wireless received signal strength (RSS).

Housecraft [9], AR MeasureKit [5], etc. These applications enable users to try out furniture, to achieve seamless home renovation, and to experience immersive games (which blend AR objects with the user's surroundings). These applications demand contextual structural information about the indoor space. In addition, home surveillance robots and autonomous home cleaning equipment require indoor maps to enhance their accuracy and coverage, and 360-degree videos also require indoor mapping to provide a realistic and engaging experience. Furthermore, structural information allows us to accurately predict signal propagation and yield accurate estimates of wireless signals, which can be used for optimizing access point (AP) placement, selection, and rate adaptation. These applications call for a fast, low-cost, and easy-to-use indoor mapping system and can tolerate errors of a few centimeters.

Existing approaches generate indoor maps either manually or using specialized sensors (e.g., laser rangers [57], depth cameras [41], sonars [29, 32]). The high cost of these approaches significantly limits the availability of indoor maps. Recently, some works [21, 26, 31, 34, 38] use crowd-sourcing to reduce deployment cost. However, crowd-sourcing incurs significant overhead and takes much longer to obtain the indoor maps. Meanwhile, it also raises incentive and privacy issues. In terms of the types of techniques, cameras, LiDAR, and specialized sensors are often used for map construction. Vision-based approaches provide detailed indoor maps, but they are computationally expensive, sensitive to lighting conditions and image quality, and raise privacy concerns. LiDAR devices are still costly and have trouble with transparent materials (e.g., windows, glass doors), which are common in indoor settings. Microsoft HoloLens [14], Google's Project Tango tablet [18], and the Oculus VR headset [15] combine multiple cameras and depth sensors to improve accuracy. However, they are costly and have seen slow adoption [11, 13].

Our Approach. Inspired by the many applications of indoor mapping and the lack of widely available tools, we develop a novel acoustic-based system for indoor map construction. It has several distinct benefits over existing solutions: (i) it is low-cost and can be implemented on a smartphone without any extra hardware, and (ii) it is robust to ambient lighting conditions and transparent materials. In our approach, we let a smartphone emit audio signals and analyze the signals reflected from the environment to infer the structure of the indoor space. Our solution provides an infrastructure-free way to obtain depth information by utilizing the built-in speakers and microphones on smartphones.
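To make the sensing principle concrete, the following sketch illustrates generic pulse-echo ranging with a smartphone speaker and microphone: cross-correlate the recording with the emitted chirp, find the direct speaker-to-microphone path, then the first strong echo, and convert the extra delay into distance. This is an illustrative simplification, not the paper's pipeline (which uses FMCW, described next); the sampling rate, chirp band, and guard interval are assumed values.

```python
import numpy as np
from scipy.signal import chirp, correlate

FS = 48_000   # assumed smartphone audio sampling rate (Hz)
C = 343.0     # speed of sound in air (m/s)

def echo_distance(recording, f0=11_000.0, f1=21_000.0, duration=0.04,
                  guard_m=0.3):
    """Illustrative pulse-echo ranging with an assumed chirp template.

    Correlate the recording with the emitted chirp, locate the
    direct-path peak, skip a short guard interval to avoid its
    sidelobes, take the next strongest peak as the echo, and convert
    the extra round-trip delay into a one-way reflector distance.
    """
    t = np.arange(int(duration * FS)) / FS
    template = chirp(t, f0=f0, t1=duration, f1=f1)
    corr = np.abs(correlate(recording, template, mode="valid"))

    direct = int(np.argmax(corr))                 # direct speaker->mic path
    guard = int(2 * guard_m / C * FS)             # skip near-in sidelobes
    echo = direct + guard + int(np.argmax(corr[direct + guard:]))

    extra_delay = (echo - direct) / FS            # extra round-trip delay (s)
    return extra_delay * C / 2.0                  # reflector distance (m)
```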
We develop a system, called SAMS (Smartphone Acoustic Mapping System), that constructs the indoor contour as the user simply walks around while holding a smartphone. Specifically, SAMS applies Frequency Modulated Continuous Wave (FMCW) technology to estimate the distances to nearby objects (e.g., walls, doors, and shelves). We find that existing audio-based tracking works [45, 48, 62] focus on obtaining only the shortest path or one moving path; the latter is achieved by taking the difference between consecutive samples to cancel out static multipath. In comparison, we explicitly leverage multiple peaks in the FMCW profile to extract critical structural information, such as corners or clutter, which is important for accurate map construction. Furthermore, we employ customized Inertial Measurement Unit (IMU) based dead-reckoning, combined with these distance measurements and systematic geometric constraints, to derive the contour of the surroundings in a calibration-free manner. We generalize our approach from a single-wall setting to a multi-wall setting, account for clutter, and support both straight and curved surfaces.

As one can imagine, SAMS can readily enable interesting applications such as navigation for blind people (by detecting obstacles), enriching AR/VR applications, or assisting indoor construction (as illustrated in Fig. 1). SAMS can even provide semantic information about physical spaces (e.g., corners, corridors) by analyzing reflected acoustic profiles, and the maps produced by SAMS can help in wireless signal strength prediction. We demonstrate this utility of our system by feeding the constructed indoor map into wireless received signal strength (RSS) prediction. We find that it yields accurate RSS prediction (within 1.5-2 dB error) and thus can help improve application performance. Another use case is to augment the light-based distance estimation techniques employed in HoloLens or small LiDAR units [1] when transparent or glass-like materials are present indoors. These acoustic distance estimates can help such devices construct more accurate maps of their surroundings.
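As a simplified illustration of the FMCW step described above, the sketch below dechirps one sweep by mixing the received signal with the transmitted one, converts beat frequencies into distances, and keeps several strong peaks rather than only the closest one, mirroring SAMS's use of multiple FMCW peaks to capture several reflectors. The sweep bandwidth, duration, and peak-picking heuristic are assumptions for illustration, not the parameters used in SAMS.

```python
import numpy as np

FS = 48_000      # assumed sampling rate (Hz)
C = 343.0        # speed of sound in air (m/s)
B = 10_000.0     # assumed sweep bandwidth (Hz), e.g. an 11-21 kHz chirp
T = 0.04         # assumed sweep duration (s)

def fmcw_profile(tx, rx):
    """Dechirp one FMCW sweep.

    Multiplying the received signal by the transmitted sweep turns each
    echo into a beat tone whose frequency is proportional to distance:
    d = C * f_beat * T / (2 * B).
    """
    mixed = tx * rx
    n = len(mixed)
    spectrum = np.abs(np.fft.rfft(mixed * np.hanning(n)))
    beat_freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    distances = C * beat_freqs * T / (2.0 * B)
    return distances, spectrum

def reflector_distances(tx, rx, num_peaks=3, min_dist=0.3, min_sep=0.2):
    """Keep several strong peaks of the range profile (not just the
    closest one), skipping the direct path and bins too close to an
    already selected peak."""
    d, mag = fmcw_profile(tx, rx)
    order = np.argsort(mag)[::-1]                 # strongest bins first
    picked = []
    for i in order:
        if d[i] < min_dist:                       # skip the direct path
            continue
        if all(abs(d[i] - p) > min_sep for p in picked):
            picked.append(float(d[i]))
        if len(picked) == num_peaks:
            break
    return sorted(picked)
```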
doi:10.1145/3214278