A content search system for mobile devices based on user context recognition

Tomohiro Mashita, Daijiro Komaki, Mayu Iwata, Kentaro Shimatani, Hiroki Miyamoto, Takahiro Hara, Kiyoshi Kiyokawa, Haruo Takemura, Shojiro Nishio
2012 IEEE Virtual Reality (VR)
People carry mobile devices with them throughout their daily lives and retrieve various kinds of information from the Internet in various situations. When searching for information (content) on a mobile device, users' activities (e.g., walking or standing) and situations (e.g., commuting in the morning or going out downtown in the evening) often change, and these changes can affect both how closely they can attend to the device's display and what information they need. Search systems should therefore provide an amount of information suited to the user's activity and a type of information suited to the user's situation. In this paper, we present the design and implementation of a content search system that takes mobile users' activities and situations into account, aiming to reduce the operational load of content searching. Our system recognizes the user's activity and switches between two content search interfaces accordingly: a location-based content search runs while the user is standing, and a menu-based content search runs while the user is walking. Both present information based on the user's situation. We also introduce an activity recognition method for mobile devices that classifies the user's activity as standing, walking, or running using the sensors built into the device.
doi:10.1109/vr.2012.6180950 dblp:conf/vr/MashitaKISMHKTN12 fatcat:2wrafplbhrc6lozzonuiuyx2xq
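
The abstract describes the mode-switching behavior only at a high level; the sketch below illustrates one way such activity-driven switching could look. The variance thresholds, the accelerometer-window format, and the fallback interface while running are assumptions for illustration, not the classification method or parameters used in the paper.

```python
import math
from statistics import pvariance

# Hypothetical thresholds on the variance of accelerometer magnitude, in (m/s^2)^2.
# These values are assumptions chosen only to make the sketch runnable.
STANDING_VAR = 0.5   # below this: little motion, treat as standing
WALKING_VAR = 6.0    # between STANDING_VAR and this: walking; above: running

def classify_activity(accel_window):
    """Classify a window of (ax, ay, az) samples as standing, walking, or running."""
    magnitudes = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in accel_window]
    var = pvariance(magnitudes)
    if var < STANDING_VAR:
        return "standing"
    if var < WALKING_VAR:
        return "walking"
    return "running"

def select_search_mode(activity):
    """Switch interfaces as described in the abstract:
    location-based search when standing, menu-based search when walking."""
    if activity == "standing":
        return "location-based"
    if activity == "walking":
        return "menu-based"
    # Assumption: while running, fall back to the lower-attention menu-based interface.
    return "menu-based"

# Example: a nearly motionless window is classified as standing,
# so the location-based search interface would be selected.
window = [(0.1, 0.0, 9.8), (0.0, 0.1, 9.8), (0.1, 0.1, 9.7)]
print(select_search_mode(classify_activity(window)))  # -> "location-based"
```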