
Automatic grasp selection using a camera in a hand prosthesis

Joseph DeGol, Aadeel Akhtar, Bhargava Manja, Timothy Bretl
2016 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)  
In this paper, we demonstrate how automatic grasp selection can be achieved by placing a camera in the palm of a prosthetic hand and training a convolutional neural network on images of objects with corresponding  ...  We achieve a grasp classification accuracy of 93.2% and show through real-time grasp selection that using a camera to augment current electromyography controlled prosthetic hands may be useful.  ...  We gratefully acknowledge the support of NVIDIA Corporation for the donation of the Tegra used for this research. We also thank Patrick Slade for helping install the camera in the hand.  ... 
doi:10.1109/embc.2016.7590732 pmid:28261002 pmcid:PMC5325038 fatcat:pbbxxnyr4rfele7wkfk5weomka
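
The DeGol et al. approach (a palm camera feeding a CNN trained on images labeled with grasp types) can be approximated with an off-the-shelf image classifier. The following is a minimal sketch assuming a folder-per-class image dataset and a placeholder set of five grasp classes; the paths, class count, and training settings are illustrative, not the authors' configuration.

```python
# Minimal sketch: fine-tune a pretrained CNN to map palm-camera images to grasp types.
# Assumes a folder-per-class dataset; paths and the five grasp classes are placeholders.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_GRASPS = 5  # e.g., power, pinch, tripod, lateral, open (illustrative only)

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("grasp_dataset/train", transform=tfm)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, NUM_GRASPS)  # replace the final layer with a grasp-type head

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```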

Artificial Perception and Semiautonomous Control in Myoelectric Hand Prostheses Increases Performance and Decreases Effort

Jeremy Mouchoux, Stefano Carisi, Strahinja Dosen, Dario Farina, Arndt F. Schilling, Marko Markovic
2021 IEEE Transactions on Robotics  
support the user with automation while preparing the prosthesis for grasping.  ...  In this article, we developed a novel man-machine interface that endows a myoelectric prosthesis (MYO) with artificial perception, estimation of user intention, and intelligent control (MYO-PACE) to continuously  ...  right-hand side using a left-hand prosthesis).  ... 
doi:10.1109/tro.2020.3047013 fatcat:k3ubfy2pkzhozlhpogzd4b627m

Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis [article]

Federico Vasile, Elisa Maiettini, Giulia Pasquale, Astrid Florio, Nicolò Boccardo, Lorenzo Natale
2022 arXiv   pre-print
Among these, so-called eye-in-hand systems automatically control the hand aperture and pre-shaping before the grasp, based on visual input coming from a camera on the wrist.  ...  We consider the task of object grasping with a prosthetic hand capable of multiple grasp types.  ...  A more advanced eye-in-hand system prototype would have the RGB camera embedded into the palm of the hand prosthesis.  ... 
arXiv:2203.09812v1 fatcat:ffl3un5hafbytmxwl5ohppvzh4

Control of prehension for the transradial prosthesis: Natural-like image recognition system

Djordje Klisic, Milos Kostic, Strahinja Dosen, Dejan Popovic
2009 Journal of Automatic Control  
eye-in-hand, and a redundant system of cameras [7].  ...  The zoomed-in sections show the estimated distance and selected grasp along with the lengths of the main axes used to select the type and size of the grasp.  ... 
doi:10.2298/jac0901027k fatcat:esqhqsd7dfhppp74whplhtojm4

A Hybrid 3D Printed Hand Prosthesis Prototype Based on sEMG and a Fully Embedded Computer Vision System

Maria Claudia F. Castro, Wellington C. Pinheiro, Glauco Rigolin
2022 Frontiers in Neurorobotics  
This study presents a new approach for an sEMG hand prosthesis based on a 3D printed model with a fully embedded computer vision (CV) system in a hybrid version.  ...  Using the Myoware board and a finite state machine, the user's intention, conveyed by a myoelectric signal, starts the process: photographing the object, then proceeding to the grasp/gesture classification,  ...  DeGol et al. (2016) proposed the inclusion of a CV system based on a convolutional neural network (CNN) with an architecture based on the VGG-VeryDeep-16 in a prosthesis for the automatic selection of the grasp  ... 
doi:10.3389/fnbot.2021.751282 pmid:35140597 pmcid:PMC8818886 fatcat:dnaowhlcrvaclhpezs4ap5hwve
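
The Castro et al. entry describes a myoelectric trigger that starts a photograph, classify, grasp sequence, which is naturally expressed as a small finite state machine. Below is a hedged sketch of such a state machine; read_emg(), capture_image(), classify_grasp(), and drive_hand() are hypothetical stubs, and the threshold value is a placeholder rather than part of the Myoware-based implementation.

```python
# Sketch of an EMG-triggered capture/classify/grasp state machine.
# read_emg(), capture_image(), classify_grasp() and drive_hand() are hypothetical stubs.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    CAPTURE = auto()
    CLASSIFY = auto()
    GRASP = auto()

EMG_THRESHOLD = 0.6  # placeholder activation level (normalized)

def step(state, context):
    if state is State.IDLE:
        if read_emg() > EMG_THRESHOLD:       # user signals intent to grasp
            return State.CAPTURE
    elif state is State.CAPTURE:
        context["image"] = capture_image()   # photograph the object in front of the camera
        return State.CLASSIFY
    elif state is State.CLASSIFY:
        context["grasp"] = classify_grasp(context["image"])
        return State.GRASP
    elif state is State.GRASP:
        drive_hand(context["grasp"])         # pre-shape the hand, then wait for the next trigger
        return State.IDLE
    return state
```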

Noble Control Scheme for Prosthetic Hands through Spatial Understanding

Yunan He, Osamu Fukuda, Nobuhiko Yamaguchi, Hiroshi Okumura, Kohei Arai
2020 International Journal of Advanced Computer Science and Applications  
A novel control scheme for prosthetic hands through spatial understanding is proposed.  ...  With the help of IMU sensors, the relationship can be tracked and kept wherever the prosthetic hand moves, even when the object is out of the view range of the camera.  ...  The introduction of accelerometers and gyroscopes to the vision-based control system solves the camera-prosthesis coordination problem, enabling automatic orientation adjustment of the prosthetic hand  ... 
doi:10.14569/ijacsa.2020.0111088 fatcat:6jybjjaviva5pi6bk3jma7giaq
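
The He et al. snippet describes using IMU measurements to keep track of the hand-object relationship after the object leaves the camera's field of view. One minimal way to do this is to store the last observed object position in the hand frame and rotate it by the hand's gyroscope-measured motion. The sketch below assumes rotation-only propagation and ignores accelerometer integration and drift; it is an illustration, not the authors' estimator.

```python
# Sketch: propagate the last seen object position (in the hand/camera frame)
# using gyroscope readings, so the grasp target survives leaving the camera view.
import numpy as np
from scipy.spatial.transform import Rotation as R

def update_object_in_hand_frame(p_obj, gyro, dt):
    """p_obj: 3-vector, object position in the hand frame at the last update.
    gyro: 3-vector, angular velocity of the hand in rad/s (body frame).
    Returns the object position expressed in the rotated hand frame."""
    # The hand rotates by gyro*dt; a world-fixed point therefore rotates the
    # opposite way when expressed in the new hand frame (translation ignored).
    delta = R.from_rotvec(np.asarray(gyro) * dt)
    return delta.inv().apply(p_obj)

# Example: object last seen 30 cm straight ahead, hand yaws 0.5 rad/s for 0.1 s.
p = np.array([0.0, 0.0, 0.30])
p = update_object_in_hand_frame(p, gyro=[0.0, 0.5, 0.0], dt=0.1)
```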

Hardware-aware Affordance Detection for Application in Portable Embedded Systems

Edoardo Ragusa, Christian Gianoglio, Strahinja Dosen, Paolo Gastaldo
2021 IEEE Access  
for selecting a grasping strategy.  ...  Such a solution could be used to substantially improve computer-vision-based prosthesis control, but it is also highly relevant for other applications (e.g., resource-constrained robotic systems).  ...  In this scheme, to grasp an object, the subject aligns the prosthesis camera towards the object and "triggers" the automatic control by generating a specific myoelectric signal (e.g., hand opening command  ... 
doi:10.1109/access.2021.3109733 fatcat:sfzwdxxhzrdc7groe2diyn6tze

Cognitive vision system for control of dexterous prosthetic hands: Experimental evaluation

Strahinja Dosen, Christian Cipriani, Milos Kostic, Marco Controzzi, Maria C Carrozza, Dejan B Popovic
2010 Journal of NeuroEngineering and Rehabilitation  
size; and 3) an embedded hand controller implements the selected grasp using closed-loop position/force control.  ...  The CVS was integrated into a hierarchical control structure: 1) the user triggers the system and controls the orientation of the hand; 2) a high-level controller automatically selects the grasp type and  ...  Acknowledgements This work is part of the research funded through the EC FP6 project "The Smart Bio-adaptive Hand Prosthesis (SmartHand)", Contract No: NMP4-CT-2006-0033423.  ... 
doi:10.1186/1743-0003-7-42 pmid:20731834 pmcid:PMC2940869 fatcat:bmnhsaaypnabnhdx76a3ghowci
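
The lowest level of the hierarchy described by Dosen et al. (an embedded hand controller implementing the selected grasp with closed-loop position/force control) can be illustrated by a simple proportional-integral force loop. The sketch below is generic; the gains and the read_force()/set_motor_current() interfaces are assumed stubs, not the SmartHand controller.

```python
# Sketch: simple PI force controller for closing a prosthetic finger on an object.
# read_force() and set_motor_current() are hypothetical hardware stubs.

def close_with_force(target_force_n, kp=0.8, ki=0.2, dt=0.01, steps=500):
    integral = 0.0
    for _ in range(steps):
        error = target_force_n - read_force()   # grip force error in newtons
        integral += error * dt
        command = kp * error + ki * integral    # PI law mapped to motor current
        set_motor_current(command)
```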

Deep learning-based artificial vision for grasp classification in myoelectric hands

Ghazal Ghazaei, Ali Alameer, Patrick Degenaar, Graham Morgan, Kianoush Nazarpour
2017 Journal of Neural Engineering  
The goal of this work was to enable trans-radial amputees to use a simple, yet efficient, computer vision system to grasp and move common household objects with a two-channel myoelectric prosthetic hand  ...  We developed a deep learning-based artificial vision system to augment the grasp functionality of a commercial prosthesis.  ...  Došen et al. [29, 30] demonstrated a dexterous hand with an integrated vision-based control system. The user controlled the prosthesis hand and the activation of the camera with myoelectric signals.  ... 
doi:10.1088/1741-2552/aa6802 pmid:28467317 fatcat:vcxgvdiydfemvo5ldbu3ctxzv4

A Survey of Teleceptive Sensing for Wearable Assistive Robotic Devices

Nili E. Krausz, Levi J. Hargrove
2019 Sensors  
Several recent publications present teleception modalities integrated into control systems and provide preliminary results, for example, for performing hand grasp prediction or endpoint control of an arm  ...  In this paper, we summarize the recent and ongoing published work in this promising new area of research.  ...  The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.  ... 
doi:10.3390/s19235238 pmid:31795240 pmcid:PMC6928925 fatcat:amg527zq5jctha53wlas6xiwxy

Virtual Reality Environment for Simulating Tasks With a Myoelectric Prosthesis: An Assessment and Training Tool

Joris M. Lambrecht, Christopher L. Pulliam, Robert F. Kirsch
2011 Journal of Prosthetics and Orthotics  
The system acquires EMG commands and residual limb kinematics, simulates the prosthesis dynamics, and displays the combined residual limb and prosthesis movements in a virtual reality environment that  ...  A virtual Box and Block Test is demonstrated. Three normally-limbed subjects performed the simulated test using a sequential and a synchronous control method.  ...  Instead, to simplify our setup, an automatic camera tracking algorithm is used to keep the camera pointed towards the region of interest, assumed to be the hand.  ... 
doi:10.1097/jpo.0b013e318217a30c pmid:23476108 pmcid:PMC3589581 fatcat:ogsrwcmofvenhlmxgen7jwpfku
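
The Lambrecht et al. snippet mentions an automatic camera-tracking algorithm that keeps the virtual camera pointed at the region of interest (the hand). A standard way to realize this in a rendered scene is a look-at view matrix, sketched below with numpy; the camera and hand positions are placeholders, and this is not the authors' rendering code.

```python
# Sketch: build a look-at view matrix that keeps a virtual camera aimed at the hand.
import numpy as np

def look_at(camera_pos, target_pos, up=(0.0, 1.0, 0.0)):
    camera_pos, target_pos, up = map(np.asarray, (camera_pos, target_pos, up))
    forward = target_pos - camera_pos
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ camera_pos          # translate world into camera space
    return view

# Re-aim the camera at the tracked hand position each frame.
view_matrix = look_at(camera_pos=[0.0, 0.3, 1.0], target_pos=[0.1, 0.0, 0.0])
```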

Proceedings of the first workshop on Peripheral Machine Interfaces: going beyond traditional surface electromyography

Claudio Castellini, Panagiotis Artemiadis, Michael Wininger, Arash Ajoudani, Merkur Alimusaj, Antonio Bicchi, Barbara Caputo, William Craelius, Strahinja Dosen, Kevin Englehart, Dario Farina, Arjan Gijsberts (+8 others)
2014 Frontiers in Neurorobotics  
The hand is automatically preshaped and AR feedback communicates the selected grasp parameters to the user.  ...  This could be used both to pre-shape the hand and to navigate the arm to reach and grasp the selected target object.  ... 
doi:10.3389/fnbot.2014.00022 pmid:25177292 pmcid:PMC4133701 fatcat:qnp4coftovgsvco4ksycp7sxvq

Pattern recognition techniques applied to electric power signal processing

Ghazi Bousaleh, Mohamad Darwiche, Fahed Hassoun
2012 2012 6th International Conference on Sciences of Electronics, Technologies of Information and Telecommunications (SETIT)  
We achieve a 71.6% reduction in the execution time for each pre-grasp and a 91% accuracy in the estimation of the position in obstacle-free movement of the fingers, but the performance of our approach reduces  ...  In this project, two ways of improving the performance of the prosthetic hand Robo-Limb, during the pre-shaping phase, are investigated with respect to the open-loop and EMG control drawbacks.  ...  A simple vision-based system is proposed, where a camera mounted on the prosthesis captures the image of the desired object to be grasped and the hand automatically chooses and executes the suitable pre-grasp  ... 
doi:10.1109/setit.2012.6482018 fatcat:yt7glpjz4jhzrf5lbbclxxp6pa

Electrode-free visual prosthesis/exoskeleton control using augmented reality glasses in a first proof-of-technical-concept study

Simon Hazubski, Harald Hoppe, Andreas Otte
2020 Scientific Reports  
By using AR glasses equipped with a monocular camera, a marker attached to the prosthesis is tracked.  ...  In this paper, we present a new concept of complete visual control for a prosthesis, an exoskeleton or another end effector using augmented reality (AR) glasses presented for the first time in a proof-of-concept  ...  Although the hand is an important tool in everyday life, in many cases, the prosthesis is only actively used a fraction of the time each day.  ... 
doi:10.1038/s41598-020-73250-6 pmid:33004950 fatcat:bfap3i6vnvdwziqetuaxbo4rvi
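
Tracking a marker attached to the prosthesis with a monocular camera, as in Hazubski et al., can be prototyped with OpenCV's ArUco module. The sketch below assumes opencv-contrib-python 4.7 or newer and uses a webcam as a stand-in for the AR-glasses camera; the marker dictionary and the mapping from marker position to a control command are placeholders, not the authors' pipeline.

```python
# Sketch: detect an ArUco marker (e.g., attached to the prosthesis) in a camera frame.
# Requires opencv-contrib-python >= 4.7 for the ArucoDetector API.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # stand-in for the AR-glasses camera
ok, frame = cap.read()
if ok:
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        # Marker center in image coordinates; could be mapped to an end-effector command.
        center = corners[0][0].mean(axis=0)
        print("marker", int(ids[0][0]), "at pixel", center)
cap.release()
```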

Grasp Type Estimation for Myoelectric Prostheses using Point Cloud Feature Learning [article]

Ghazal Ghazaei, Federico Tombari, Nassir Navab, Kianoush Nazarpour
2019 arXiv   pre-print
an appropriate grasp type, using a deep network architecture based on 3D point clouds called PointNet.  ...  We address these limitations by augmenting the prosthetic hand with an off-the-shelf depth sensor to enable the prosthesis to see the object's depth, record a single view (2.5-D) snapshot, and estimate  ...  In this paper, an effective and efficient approach for augmenting a hand prosthesis with a depth sensor was presented.  ... 
arXiv:1908.02564v1 fatcat:4xoxe5qu35cdrcy65amurwjcoe
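
The Ghazaei et al. pipeline above (a single 2.5-D depth snapshot converted to a point cloud and classified into a grasp type with a PointNet-based network) can be outlined with a very small shared-MLP-and-max-pool classifier. The sketch below is a simplified stand-in for PointNet with placeholder point counts and class labels, not the authors' architecture.

```python
# Sketch: a tiny PointNet-style classifier mapping a 2.5-D point cloud to a grasp type.
# Shared per-point MLP (1x1 convolutions) followed by a global max pool; sizes are placeholders.
import torch
import torch.nn as nn

NUM_GRASPS = 4  # illustrative grasp classes

class TinyPointNet(nn.Module):
    def __init__(self, num_classes=NUM_GRASPS):
        super().__init__()
        self.point_mlp = nn.Sequential(        # applied to every point independently
            nn.Conv1d(3, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU(),
            nn.Conv1d(128, 256, 1), nn.ReLU(),
        )
        self.head = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, num_classes))

    def forward(self, points):                 # points: (batch, 3, num_points)
        features = self.point_mlp(points)
        global_feat = features.max(dim=2).values   # order-invariant pooling over points
        return self.head(global_feat)

# Example: classify one cloud of 1024 points sampled from the depth snapshot.
cloud = torch.randn(1, 3, 1024)
logits = TinyPointNet()(cloud)
```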
Showing results 1–15 out of 659 results