Multimodal interaction with an autonomous forklift

Andrew Correa, Matthew R. Walter, Luke Fletcher, Jim Glass, Seth Teller, Randall Davis
2010. Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI '10)
We describe a multimodal framework for interacting with an autonomous robotic forklift. A key element enabling effective interaction is a wireless, handheld tablet with which a human supervisor can command the forklift using speech and sketch. Most current sketch interfaces treat the canvas as a blank slate. In contrast, our interface uses live and synthesized camera images from the forklift as a canvas, and augments them with object and obstacle information from the world. This connection enables users to "draw on the world," enabling a simpler set of sketched gestures. Our interface supports commands that include summoning the forklift and directing it to lift, transport, and place loads of palletized cargo. We describe an exploratory evaluation of the system designed to identify areas for detailed study. Our framework incorporates external signaling to interact with humans near the vehicle. The robot uses audible and visual annunciation to convey its current state and intended actions. The system also provides seamless autonomy handoff: any human can take control of the robot by entering its cabin, at which point the forklift can be operated manually until the human exits.
doi:10.1145/1734454.1734550 dblp:conf/hri/CorreaWFGTD10