153 Hits in 7.2 sec

Artificial neural networks for spatial perception: Towards visual object localisation in humanoid robots

Jürgen Leitner, Simon Harding, Mikhail Frank, Alexander Förster, Jürgen Schmidhuber
2013 The 2013 International Joint Conference on Neural Networks (IJCNN)  
In this paper, we present our ongoing research to allow humanoid robots to learn spatial perception.  ...  We are using artificial neural networks (ANN) to estimate the location of objects in the robot's environment.  ...  CONCLUSIONS In this paper we investigate artificial neural networks (ANN) and their applicability to the spatial perception problem in robots.  ... 
doi:10.1109/ijcnn.2013.6706819 dblp:conf/ijcnn/LeitnerHFFS13 fatcat:ivsoifv6pbflrayhhsjgtuolqe

Transferring spatial perception between robots operating in a shared workspace

Jürgen Leitner, Simon Harding, Mikhail Frank, Alexander Förster, Jürgen Schmidhuber
2012 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems  
While the iCub moves it observes the object, and a neural network then learns how to relate its pose and visual inputs to the object location.  ...  We use a Katana robotic arm to teach an iCub humanoid robot how to perceive the location of the objects it sees.  ...  RESULTS The trained neural networks allow estimating the position of the object in 3D space, with a high enough accuracy to allow for grasping experiments.  ... 
doi:10.1109/iros.2012.6385642 dblp:conf/iros/LeitnerHFFS12 fatcat:olkudkitifhwbhz3i42yv5argm

Reactive Reaching and Grasping on a Humanoid - Towards Closing the Action-Perception Loop on the iCub

Jürgen Leitner, Mikhail Frank, Alexander Förster, Jürgen Schmidhuber
2014 Proceedings of the 11th International Conference on Informatics in Control, Automation and Robotics  
An important feature is that the system can avoid obstacles (other objects detected in the visual stream) while reaching for the intended target object.  ...  Its functionality is showcased by having our iCub humanoid robot pick up objects from a table in front of it.  ...  FP7-IST-IP-231722, 'Intrinsically Motivated Cumulative Learning Versatile Robots' (IM-CLeVeR).  ... 
doi:10.5220/0005113401020109 dblp:conf/icinco/LeitnerFFS14 fatcat:iao4niifsbdr7iw5lum6hns7vm

39th European Conference on Visual Perception (ECVP) 2016 Barcelona

2016 Perception  
Artificial neural networks are inspired by the brain, and their computations could be implemented in biological neurons.  ...  and visual discomfort. [1] Li, Network-Comp Neural, 1999  ...  Humans, and possibly many other animals, use shading as a cue towards object-shape.  ... 
doi:10.1177/0301006616671273 fatcat:el2dmsuk6zd25pedijucuexl5m

Learning Spatial Object Localization from Vision on a Humanoid Robot

Jürgen Leitner, Simon Harding, Mikhail Frank, Alexander Förster, Jürgen Schmidhuber
2012 International Journal of Advanced Robotic Systems  
We present a combined machine learning and computer vision approach for robots to localize objects.  ...  It allows our iCub humanoid to quickly learn to provide accurate 3D position estimates (in the centimetre range) of objects seen.  ...  The authors would like to thank: Leo Pape (IDSIA) & Ugo Pattacini (IIT) for the Cartesian controller and stereo camera calibration of the iCub; Davide Migliore & Alexandre Bernardino (IST) for helping  ... 
doi:10.5772/54657 fatcat:n2tbhq2ze5cvtl35o2vluoczc4

The State of Lifelong Learning in Service Robots: Current Bottlenecks in Object Perception and Manipulation [article]

S. Hamidreza Kasaei, Jorik Melsen, Floris van Beers, Christiaan Steenkist, Klemen Voncina
2021 arXiv   pre-print
Service robots are appearing more and more in our daily life. The development of service robots combines multiple fields of research, from object perception to object manipulation.  ...  In such environments, no matter how extensive the training data used for batch learning, a robot will always face new objects.  ...  In object perception, it applies to object representation, differentiating between object descriptors that are either hand-crafted or trained by a neural network.  ... 
arXiv:2003.08151v3 fatcat:ks4t3qfq3bhszgbb6lzrwwq3iq

How instructions modify perception: An fMRI study investigating brain areas involved in attributing human agency

James Stanley, Emma Gowen, R. Christopher Miall
2010 NeuroImage  
Behavioural results suggested that agency instructions influenced participants' perceptions of the stimuli.  ...  The fMRI analysis indicated different functions within the paracingulate cortex: ventral paracingulate cortex was more active for human compared to computer agency instructed trials across all stimulus  ...  Acknowledgments We would like to thank Leif Johansson for recording the actions for the point-light animations; Zoe Kourtzi for the helpful discussions in the design stages of this process, and, along  ... 
doi:10.1016/j.neuroimage.2010.04.025 pmid:20398769 pmcid:PMC2887490 fatcat:ij7gfd3khff2je7tzqdnmpynti

The State of Lifelong Learning in Service Robots: Current Bottlenecks in Object Perception and Manipulation

S. Hamidreza Kasaei, Jorik Melsen, Floris van Beers, Christiaan Steenkist, Klemen Voncina
2021 Journal of Intelligent and Robotic Systems  
Service robots are appearing more and more in our daily life. The development of service robots combines multiple fields of research, from object perception to object manipulation.  ...  In such environments, no matter how extensive the training data used for batch learning, a robot will always face new objects.  ...  In object perception, it applies to object representation, differentiating between object descriptors that are either hand-crafted or trained by a neural network.  ... 
doi:10.1007/s10846-021-01458-3 fatcat:eeunivdvmrcg3piyx3r3pdsviq

Robot emotions generated and modulated by visual features of the environment

Aaron S.W. Wong, Steven Nicklin, Kenny Hong, Stephan K. Chalup, Peter Walla
2013 2013 IEEE Symposium on Computational Intelligence for Creativity and Affective Computing (CICAC)  
Pilot experiments demonstrate how a humanoid robot tries to learn through interaction with a human companion to express emotions associated with different environmental scenes in a (near) human-like manner  ...  A new and challenging task is to emulate emotional responses on a robot that are caused by visual stimuli, such that the robot's responses mirror that of the human user.  ...  In this study, the artificial neural network consists only of one layer of weights, i.e. with no hidden layer of units.  ... 
doi:10.1109/cicac.2013.6595215 dblp:conf/cicac/WongNHCW13 fatcat:tchqlpf5pjedxdawll6dphjvje

Enhanced Robot Speech Recognition Using Biomimetic Binaural Sound Source Localization

Jorge Davila-Chacon, Jindong Liu, Stefan Wermter
2018 IEEE Transactions on Neural Networks and Learning Systems  
More specifically, a robot orients itself toward the angle where the signal-to-noise ratio (SNR) of speech is maximized for one microphone before doing an ASR task.  ...  Then, a feedforward neural network is used to handle high levels of ego noise and reverberation in the signal. Finally, the sound signal is fed into an ASR system.  ...  An important advantage of our biomimetic neural representation of spatial cues is that it can be directly integrated with vision for audio-visual spatial attention [58] .  ... 
doi:10.1109/tnnls.2018.2830119 pmid:29993561 fatcat:wpyurosalfgs5aec5titoklkbq

Physiologie de la perception et de l'action [Physiology of Perception and Action]

Alain Berthoz
2010 Annuaire du Collège de France : Résumé Des Cours et Travaux  
critical period for orientation plasticity in the cat visual cortex », PLoS One , 2009, 4:e5380.  ...  Books and chapters in collective volumes 94— Benchenane K., Zugaro M.B., Wiener S.I.: « Neural Bases of Spatial Learning and Memory », in Binder M.D, Hirokawa N., Windhorst U.  ... 
doi:10.4000/annuaire-cdf.358 fatcat:iqvywnguozhavcd3sfggppaxdu

Haptic SLAM for context-aware robotic hand prosthetics - simultaneous inference of hand pose and object shape using particle filters

Feryal M. P. Behbahani, Ruth Taunton, Andreas A. C. Thomik, A. Aldo Faisal
2015 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER)  
We present a computational model for haptic exploration and shape reconstruction derived from mobile robotics: simultaneous localisation and mapping (SLAM).  ...  In conjunction with tactile-enabled prostheses, this could allow for online object recognition and pose adaptation for more natural prosthetic control.  ...  Here, we present a physics model of a humanoid hand, with 21 degrees of freedom and draw parallels between the problem of simultaneous localisation and mapping (SLAM) in robotics [3] , [4] and mapping  ... 
doi:10.1109/ner.2015.7146724 dblp:conf/ner/BehbahaniTTF15 fatcat:u5cffkjw3zcxbhbzmfdgdverhy

2020 Index IEEE Robotics and Automation Letters Vol. 5

2020 IEEE Robotics and Automation Letters  
., +, LRA April 2020 676-682 Visual Object Search by Learning Spatial Context.  ...  ., +, LRA April 2020 1835-1842 Deep Neural Network Approach in Robot Tool Dynamics Identification for Bilateral Teleoperation.  ... 
doi:10.1109/lra.2020.3032821 fatcat:qrnouccm7jb47ipq6w3erf3cja

Modeling development of natural multi-sensory integration using neural self-organisation and probabilistic population codes

Johannes Bauer, Jorge Dávila-Chacón, Stefan Wermter
2014 Connection science  
In this paper, we propose a model of learning multisensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its  ...  We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration  ...  To validate our model, we show in Section 3 that the algorithm can be used effectively in a real-world scenario by testing it in a robotic audio-visual object localisation task and demonstrate that audio-visual  ... 
doi:10.1080/09540091.2014.971224 fatcat:nr4w7fbyczhmxkt5rcpzhox7dm

The Robot Vibrissal System: Understanding Mammalian Sensorimotor Co-ordination Through Biomimetics [chapter]

Tony J. Prescott, Ben Mitchinson, Nathan F. Lepora, Stuart P. Wilson, Sean R. Anderson, John Porrill, Paul Dean, Charles W. Fox, Martin J. Pearson, J. Charles Sullivan, Anthony G. Pipe
2015 Sensorimotor Integration in the Whisker System  
We also demonstrate how the appropriate co-ordination of these sub-systems, with a model of brain architecture, can give rise to integrated behaviour in a life-like whiskered robot.  ...  We consider the problem of sensorimotor co-ordination in mammals through the lens of vibrissal touch, and via the methodology of embodied computational neuroscience, using biomimetic robots to synthesize  ...  Science project "Development of motor-sensory strategies for vibrissal active touch".  ... 
doi:10.1007/978-1-4939-2975-7_10 fatcat:v6332q6usjbfrip325kqmy7yfi
Showing results 1 — 15 out of 153 results