A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2020, and the original URL remains accessible.
Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement
2017
IEEE Transactions on Affective Computing
In this paper we introduce a novel dataset, the Multimodal Human-Human-Robot Interactions (MHHRI) dataset, with the aim of studying personality simultaneously in human-human interactions (HHI) and human-robot interactions (HRI) and its relationship with engagement. Multimodal data were collected during a controlled interaction study in which dyadic interactions between two human participants and triadic interactions between two human participants and a robot took place, with interactants asking a
doi:10.1109/taffc.2017.2737019
fatcat:vpaeaqp2nvdrbkncl6qga2tv4q