HeadScan: A Wearable System for Radio-Based Sensing of Head and Mouth-Related Activities

Biyi Fang, Nicholas D. Lane, Mi Zhang, Fahim Kawsar
2016 15th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN)
The popularity of wearables continues to rise. However, their possible applications, and even their raw functionality, are constrained by the types of sensors currently available. Accelerometers and gyroscopes struggle to capture complex user activities. Microphones and image sensors are more powerful but capture privacy-sensitive information. Physiological sensors are obtrusive to users, as they often require skin contact and must be placed at specific body positions to function. In contrast, radio-based sensing uses wireless radio signals to capture movements of different parts of the body, and therefore provides a contactless and privacy-preserving approach to detecting and monitoring human activities. In this paper, we contribute to the search for new sensing modalities for the next generation of wearable devices by exploring the feasibility of mobile radio-based human activity recognition. We believe radio-based sensing has the potential to fundamentally transform wearables as we currently know them. As the first step toward our vision, we have designed and developed HeadScan, a first-of-its-kind wearable for radio-based sensing of a number of human activities that involve head and mouth movements. HeadScan requires only a pair of small antennas placed on the shoulder and collar and one wearable unit worn on the arm or the belt of the user. HeadScan uses the fine-grained channel state information (CSI) measurements extracted from radio signals and incorporates a novel signal processing pipeline that converts the raw CSI measurements into the targeted human activities. To examine the feasibility and performance of HeadScan, we have collected approximately 50.5 hours of data from seven users. Our wide-ranging experiments include comparisons to a conventional skin-contact, audio-based sensing approach for tracking the same set of head and mouth-related activities. Our experimental results highlight the enormous potential of our radio-based mobile sensing approach and provide guidance for future explorations.
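To make the abstract's high-level description more concrete, the sketch below illustrates in Python how fine-grained CSI amplitude streams could, in principle, be converted into activity labels: band-pass filtering, simple per-window statistical features, and a standard classifier. This is not the authors' HeadScan pipeline; the sampling rate, filter band, window length, feature set, and synthetic stand-in data are all assumptions made purely for demonstration.

    # Minimal, illustrative CSI-to-activity sketch (NOT the HeadScan pipeline).
    # All parameters below are assumptions; synthetic data replaces real CSI.
    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    FS = 100          # assumed CSI sampling rate (Hz)
    WINDOW = 2 * FS   # 2-second analysis windows

    def bandpass(x, low=0.3, high=8.0, fs=FS, order=4):
        """Keep the frequency band where slow body movements would appear."""
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    def extract_features(window):
        """Simple per-window summary statistics of the filtered amplitude."""
        return np.array([
            window.mean(),
            window.std(),
            np.abs(np.diff(window)).mean(),        # average sample-to-sample change
            np.square(window).sum() / len(window)  # mean signal energy
        ])

    def synthetic_csi(label, n_windows=200, seed=0):
        """Toy CSI amplitude windows; class 1 adds a periodic motion component."""
        rng = np.random.default_rng(seed + label)
        t = np.arange(WINDOW) / FS
        windows = []
        for _ in range(n_windows):
            noise = rng.normal(0, 0.1, WINDOW)
            motion = 0.5 * np.sin(2 * np.pi * 2.0 * t) if label == 1 else 0.0
            windows.append(motion + noise)
        return np.array(windows)

    # Toy dataset: class 0 = "still", class 1 = "mouth movement".
    X = np.vstack([synthetic_csi(0), synthetic_csi(1)])
    y = np.array([0] * 200 + [1] * 200)
    features = np.array([extract_features(bandpass(w)) for w in X])

    X_tr, X_te, y_tr, y_te = train_test_split(features, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print("toy accuracy:", clf.score(X_te, y_te))

On real CSI traces, the windowing, filter band, and feature set would need to be tuned to the movements of interest; the sketch is only meant to show the general shape of such a pipeline.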
doi:10.1109/ipsn.2016.7460677 dblp:conf/ipsn/FangLZK16 fatcat:yosr7ch5enavnlwyqq5wxweb3e