A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit <a rel="external noopener" href="http://aiweb.techfak.uni-bielefeld.de/files/barthoc.pdf">the original URL</a>. The file type is <code>application/pdf</code>.
Human-Oriented Interaction With an Anthropomorphic Robot
<span title="">2007</span>
<i title="Institute of Electrical and Electronics Engineers (IEEE)">
<a target="_blank" rel="noopener" href="https://fatcat.wiki/container/dg2bd7saqzaj3e4myjqwziwpa4" style="color: black;">IEEE Transactions on Robotics</a>
</i>
A very important aspect in developing robots capable of human-robot interaction (HRI) is research into natural, human-like communication and, subsequently, the development of a research platform with multiple HRI capabilities for evaluation. Besides a flexible dialog system and speech understanding, an anthropomorphic appearance has the potential to support intuitive usage and understanding of a robot; e.g., human-like facial expressions and deictic gestures can be produced as well as understood by the robot. As a consequence of our effort to create an anthropomorphic appearance and to come close to a human-human interaction model for a robot, we decided to use human-like sensors, i.e., only two cameras and two microphones, in analogy to human perceptual capabilities. Despite the challenges these limits impose on perception, a robust attention system for tracking and interacting with multiple persons simultaneously in real time is presented. The tracking approach is sufficiently generic to work on robots with varying hardware, as long as stereo audio data and images from a video camera are available. To easily implement different interaction capabilities, such as deictic gestures, natural adaptive dialogs, and emotion awareness, on the robot, we apply a modular integration approach utilizing XML-based data exchange. The paper focuses on our efforts to bring together different interaction concepts and perception capabilities, integrated on a humanoid robot, to achieve comprehensive human-oriented interaction.
<span class="external-identifiers">
<a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/tro.2007.904903">doi:10.1109/tro.2007.904903</a>
<a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/gzjr3le4ifespeiwidyww4b2um">fatcat:gzjr3le4ifespeiwidyww4b2um</a>
</span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170808135352/http://aiweb.techfak.uni-bielefeld.de/files/barthoc.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext">
<button class="ui simple right pointing dropdown compact black labeled icon button serp-button">
<i class="icon ia-icon"></i>
Web Archive
[PDF]
<div class="menu fulltext-thumbnail">
<img src="https://blobs.fatcat.wiki/thumbnail/pdf/6a/71/6a715888e3709524655b2a1ae335697808e7e25f.180px.jpg" alt="fulltext thumbnail" loading="lazy">
</div>
</button>
</a>
<a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/tro.2007.904903">
<button class="ui left aligned compact blue labeled icon button serp-button">
<i class="external alternate icon"></i>
ieee.org
</button>
</a>