A Model for Multimodal Humanlike Perception based on Modular Hierarchical Symbolic Information Processing, Knowledge Integration, and Learning
2007
Proceedings of the 2nd International Conference on Bio-Inspired Models of Network Information and Computing Systems
Automatic surveillance systems as well as autonomous robots are technical systems that would profit from humanlike perception for effective, efficient, and flexible operation. In this article, a model for humanlike perception is introduced, based on hierarchical modular fusion of multi-sensory data, symbolic information processing, integration of knowledge and memory, and learning. The model is inspired by findings from neuroscience. Information from diverse sensors is
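The abstract's core idea, modular bottom-up fusion of multi-sensory data into symbols that are then integrated with stored knowledge, can be illustrated with a minimal sketch. This is a hypothetical toy example, not the paper's implementation; all module names, thresholds, and the memory mechanism are invented for illustration.

```python
# Hypothetical sketch of hierarchical modular symbolic fusion
# (illustrative only; not taken from the paper).

def fuse_visual(motion: float) -> str:
    # Low-level visual module: raw feature -> symbol.
    return "moving_object" if motion > 0.5 else "static_scene"

def fuse_audio(loudness: float) -> str:
    # Low-level auditory module: raw feature -> symbol.
    return "loud_event" if loudness > 0.7 else "quiet"

def integrate(visual_symbol: str, audio_symbol: str, memory: set) -> str:
    # Higher-level module: combine symbols from both modalities
    # and store the result, a crude stand-in for memory/learning.
    if visual_symbol == "moving_object" and audio_symbol == "loud_event":
        percept = "possible_intruder"
    else:
        percept = "normal_activity"
    memory.add(percept)
    return percept

memory: set = set()
percept = integrate(fuse_visual(0.9), fuse_audio(0.9), memory)
print(percept)  # -> possible_intruder
```

The point of the sketch is the hierarchy: each modality is processed by its own module into a symbol, and a higher layer fuses those symbols with stored state, mirroring the modular, layered organization the abstract attributes to neuroscience-inspired perception.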
doi:10.4108/icst.bionetics2007.2421
dblp:conf/bionetics/Velik07
fatcat:o7rml6ac2ze2tcx7ts4gkalogu