8,336 Hits in 5.9 sec

Hand Gesture Recognition based on Near-infrared Sensing Wristband

Andualem Maereg, Yang Lou, Emanuele Secco, Raymond King
<span title="">2020</span> <i title="SCITEPRESS - Science and Technology Publications"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/o73z4tintnbufhho5cqxzxeetq" style="color: black;">Proceedings of the 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications</a> </i> &nbsp;
In this paper, we present a low-cost gesture sensing system that utilizes near-infrared emitters (600–1100 nm) and photo-receivers encompassing the wrist to infer hand gestures.  ...  Wrist-worn gesture sensing systems can be used as a seamless interface for AR/VR interactions and control of various devices.  ...  Hand gesture recognition refers to the problem of identifying hand gestures executed by a user at a specific time.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.5220/0008909401100117">doi:10.5220/0008909401100117</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/grapp/MaeregLSK20.html">dblp:conf/grapp/MaeregLSK20</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/j2uzbqjlczfelhcw547yfjrkqm">fatcat:j2uzbqjlczfelhcw547yfjrkqm</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20201106084413/https://hira.hope.ac.uk/id/eprint/2979/1/HUCAPP_2020_4.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/69/b3/69b3a540c590908af919b3ceacf07522d3f0bd6c.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.5220/0008909401100117"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>

Vision-Based Gesture Recognition: A Review [chapter]

Ying Wu, Thomas S. Huang
<span title="">1999</span> <i title="Springer Berlin Heidelberg"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/2w3awgokqne6te4nvlofavy5a4" style="color: black;">Lecture Notes in Computer Science</a> </i> &nbsp;
The use of gesture as a natural interface serves as a motivating force for research in the modeling, analysis, and recognition of gestures.  ...  A survey of recent vision-based gesture recognition approaches is given in this paper. We review methods of static hand posture and temporal gesture recognition.  ...  The use of human movements, especially hand gestures, has become an important part of HCII in recent years, which serves as a motivating force for research in modeling, analyzing and recognition of hand  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1007/3-540-46616-9_10">doi:10.1007/3-540-46616-9_10</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/f64bukvzgvgovggdeg645hkjcq">fatcat:f64bukvzgvgovggdeg645hkjcq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20131228134843/http://cs.iupui.edu:80/~tuceryan/pdf-repository/Wu1999.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/7d/5e/7d5e6e904285b98fad5f3cfc7f247fb935feab9d.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1007/3-540-46616-9_10"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> springer.com </button> </a>

Spectral Collaborative Representation based Classification for hand gestures recognition on electromyography signals

Ali Boyali, Naohisa Hashimoto
<span title="">2016</span> <i title="Elsevier BV"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/jk3pblxy6rgufncpryy2osctie" style="color: black;">Biomedical Signal Processing and Control</a> </i> &nbsp;
In this study, we introduce a novel variant and application of Collaborative Representation based Classification (CRC) in the spectral domain for recognition of hand gestures using raw surface electromyography  ...  The worst recognition result, which is nonetheless the best in the literature, is 97.3% across the four sets of experiments for each hand gesture.  ...  We collected gesture couples for all of the classes represented in the training dictionary. The hand performs two gestures repeatedly for the gesture couples.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1016/j.bspc.2015.09.001">doi:10.1016/j.bspc.2015.09.001</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/v7y24detd5dl7at3xvvhivbf6a">fatcat:v7y24detd5dl7at3xvvhivbf6a</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20171005095919/http://publisher-connector.core.ac.uk/resourcesync/data/elsevier/pdf/f14/aHR0cDovL2FwaS5lbHNldmllci5jb20vY29udGVudC9hcnRpY2xlL3BpaS9zMTc0NjgwOTQxNTAwMTQ5NA%3D%3D.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/51/4e/514e6dc49dd2804d6caa83088fd8b188671f82b0.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1016/j.bspc.2015.09.001"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> elsevier.com </button> </a>
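The abstract above describes the core CRC decision rule: a test sample is coded as a regularized least-squares combination of the whole training dictionary, then assigned to the class whose training columns reconstruct it with the smallest residual. A minimal NumPy sketch of that rule follows; the paper's spectral-domain variant and sEMG feature extraction are not reproduced here, and the data in the test is synthetic:

```python
import numpy as np

def crc_classify(D, labels, y, lam=0.01):
    """Collaborative Representation based Classification (CRC).
    D: (d, n) dictionary of training samples as columns.
    labels: (n,) class label of each column.
    y: (d,) test sample. lam: ridge regularizer."""
    n = D.shape[1]
    # Code y over the WHOLE dictionary: alpha = (D^T D + lam I)^-1 D^T y
    alpha = np.linalg.solve(D.T @ D + lam * np.eye(n), D.T @ y)
    # Class-wise reconstruction residuals; smallest residual wins
    classes = np.unique(labels)
    residuals = [np.linalg.norm(y - D[:, labels == c] @ alpha[labels == c])
                 for c in classes]
    return classes[int(np.argmin(residuals))]
```

The collaborative (all-class) coding step is what distinguishes CRC from per-class regression: the ridge term makes the code well-posed even when the dictionary is over-complete.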

Whole hand modeling using 8 wearable sensors

Christopher-Eyk Hrabia, Katrin Wolf, Mathias Wilhelm
<span title="">2013</span> <i title="ACM Press"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/wqoakkhwrfes3irhlqvvzj7zcq" style="color: black;">Proceedings of the 4th Augmented Human International Conference on - AH &#39;13</a> </i> &nbsp;
Our hand model could potentially serve for rich hand-model-based gestural interaction as it covers all 26 DOF in the human hand.  ...  As modeling the whole hand has many advantages (e.g. for complex gesture detection), we aim to model the whole hand while at the same time keeping the hand's natural degrees of freedom (DOF) and the  ...  Hand model improvement We are able to eliminate the need for sensors attached to the fingertips by using the linear angular relationship between the two upper finger joints of the four fingers other than the thumb  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/2459236.2459241">doi:10.1145/2459236.2459241</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/aughuman/HrabiaWW13.html">dblp:conf/aughuman/HrabiaWW13</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/q2wrjdjafnfhbpyirbfel4e7pq">fatcat:q2wrjdjafnfhbpyirbfel4e7pq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20150220162913/http://katrinwolf.info/wp-content/uploads/2013/05/AU13_fullHandModel.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/2f/fe/2ffe1927f3a3ccff490619928fd5b83678a5bdb8.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/2459236.2459241"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> acm.org </button> </a>

Hand modeling, analysis and recognition

Ying Wu, T.S. Huang
<span title="">2001</span> <i title="Institute of Electrical and Electronics Engineers (IEEE)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/txj4mzfpbne4hhwgo2ku5cycbq" style="color: black;">IEEE Signal Processing Magazine</a> </i> &nbsp;
Three-dimensional hand models offer a rich description to fully capture hand motion. Static hand posture recognition and temporal gesture recognition are the two main parts of gesture recognition.  ...  Gesture Recognition by HMM The HMM is a type of statistical model widely used in speech recognition.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/79.924889">doi:10.1109/79.924889</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/ynpygvmztrgx3opq5nhpbfr5ry">fatcat:ynpygvmztrgx3opq5nhpbfr5ry</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170808131155/http://www.ifp.illinois.edu/~yingwu/papers/IEEE_MSP.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/07/ea/07ea413c770644d5e1fb8f0749ef570bde99b47e.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/79.924889"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a>
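This survey (and Mitra &amp; Acharya's below) notes the HMM as the standard tool for temporal gesture recognition: each candidate gesture has its own trained HMM, and a new observation sequence is assigned to the model under which it is most likely. A self-contained sketch of that forward-algorithm scoring, using toy two-state models over discrete observation symbols (all parameter values here are invented for illustration, not taken from the paper):

```python
import numpy as np

def hmm_log_likelihood(pi, A, B, obs):
    """Scaled forward algorithm: log P(obs | model) for a discrete-output HMM.
    pi: (S,) initial state probs, A: (S,S) transition probs,
    B: (S,K) emission probs, obs: sequence of symbol indices in [0, K)."""
    alpha = pi * B[:, obs[0]]
    log_p = 0.0
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()      # rescale each step to avoid underflow
        log_p += np.log(s)
        alpha /= s
    return log_p + np.log(alpha.sum())

def classify(models, obs):
    """Pick the gesture whose HMM gives the observed sequence the highest likelihood."""
    scores = {name: hmm_log_likelihood(*m, obs) for name, m in models.items()}
    return max(scores, key=scores.get)
```

In a real system the observation symbols would come from quantized hand features (position, orientation, posture codebook entries) and the per-gesture models would be trained with Baum–Welch.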

Type-hover-swipe in 96 bytes

Stuart Taylor, Cem Keskin, Otmar Hilliges, Shahram Izadi, John Helmes
<span title="">2014</span> <i title="ACM Press"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/4obeqfuqs5gzbos5ah64a7m57a" style="color: black;">Proceedings of the 32nd annual ACM conference on Human factors in computing systems - CHI &#39;14</a> </i> &nbsp;
We detail the hardware and gesture recognition algorithm, provide accuracy results, and demonstrate a large set of gestures designed to be performed with the device.  ...  Together these form a motion signature (E+F) which can be used to robustly recognize a number of dynamic on-keyboard (G) and hover gestures (H) using a machine learning-based classifier.  ...  These could be used to switch between two modes in an application.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/2556288.2557030">doi:10.1145/2556288.2557030</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/chi/TaylorKHIH14.html">dblp:conf/chi/TaylorKHIH14</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/7riu5o3xf5bvhj6q4tx7lbtozu">fatcat:7riu5o3xf5bvhj6q4tx7lbtozu</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170809144823/https://ait.ethz.ch/projects/2014/96Bytes/downloads/p1695-taylor(MotionKeyboard).pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/da/71/da71286a0986a8a1864876667811dd92845282f0.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/2556288.2557030"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> acm.org </button> </a>

Hand Gesture Modeling Using Dynamic Bayesian Networks and Deformable Templates

Artur Wilkowski, Wlodzimierz Kasprzak
<span title="">2011</span> <i title="IEEE"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/xuuaxgbe4rhazjaqjhtij5sybu" style="color: black;">2011 Seventh International Conference on Signal Image Technology &amp; Internet-Based Systems</a> </i> &nbsp;
The Deformable Templates methodology is applied for hand shape modeling. Experimental evaluation of articulated hand tracking in cluttered environment using particle filtering is provided.  ...  The gesture model is given in terms of a Dynamic Bayesian network that incorporates a Hidden Markov Model in order to utilize prior information on gesture structure in the tracking task.  ...  In [11] the DBN formalism was used for the solution of a two-hand gesture recognition problem.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/sitis.2011.55">doi:10.1109/sitis.2011.55</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/sitis/WilkowskiK11.html">dblp:conf/sitis/WilkowskiK11</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/uegm3tfsnbb63dncztmhi4kp3y">fatcat:uegm3tfsnbb63dncztmhi4kp3y</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170829174008/http://staff.elka.pw.edu.pl/~wkasprza/PAP/SITIS2011.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/4a/25/4a25d73582dde84c93a0f3e2e3b2d3ec9342de21.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/sitis.2011.55"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a>

Artificial Neural Network Optimization with Levenberg&ndash;Marquardt Algorithm for Dynamic Gesture Recognition

Stephen John Dy, Matthew Adrianne Gonzales, Lenard Lozano, Miguel Angelo Suniga, Alexander Abad
<span title="2018-07-27">2018</span> <i title="Science Publishing Corporation"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/piy2nrvrjrfcfoz5nmre6zwa4i" style="color: black;">International Journal of Engineering &amp; Technology</a> </i> &nbsp;
The study focuses on the use of the Levenberg&ndash;Marquardt algorithm as an optimization algorithm for a multilayer Artificial Neural Network in constructing a predictive model for dynamic gestures.  ...  The study concludes that the network architecture is adequate for gesture recognition, with an average recognition rate of 83%, though a larger data set may improve this value.  ...  De La Salle University - Manila (DLSU), De La Salle University - Laguna Campus, and Department of Science and Technology - Engineering Research and Development for Technology (DOST-ERDT) for funding and helping us  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.14419/ijet.v7i3.13.16312">doi:10.14419/ijet.v7i3.13.16312</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/kjtln62xrfg35nkjbu2iyfhiue">fatcat:kjtln62xrfg35nkjbu2iyfhiue</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20190426221826/https://www.sciencepubco.com/index.php/ijet/article/download/16312/6911" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/f5/ff/f5ffc4d686635c86f8ef77208c2688dafad3f803.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.14419/ijet.v7i3.13.16312"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> Publisher / doi.org </button> </a>
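The snippet above uses Levenberg&ndash;Marquardt to train a multilayer network. At its core, LM interpolates between Gauss&ndash;Newton and gradient descent via a damped normal-equation step, halving the damping when a step reduces the residual and doubling it otherwise. A minimal sketch on a generic nonlinear least-squares problem (a toy curve fit stands in for network training here, with a numerical Jacobian; none of this is the paper's actual setup):

```python
import numpy as np

def levenberg_marquardt(residual, p0, n_iter=50, mu=1e-2):
    """Minimize ||residual(p)||^2 with damped Gauss-Newton (LM) steps."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residual(p)
        # Forward-difference Jacobian of the residual vector w.r.t. parameters
        eps = 1e-6
        J = np.column_stack([
            (residual(p + eps * np.eye(len(p))[j]) - r) / eps
            for j in range(len(p))
        ])
        # Damped normal equations: (J^T J + mu I) delta = -J^T r
        delta = np.linalg.solve(J.T @ J + mu * np.eye(len(p)), -J.T @ r)
        if np.linalg.norm(residual(p + delta)) < np.linalg.norm(r):
            p, mu = p + delta, mu * 0.5   # accept: trust Gauss-Newton more
        else:
            mu *= 2.0                     # reject: lean toward gradient descent
    return p
```

For network training, `residual(p)` would return the vector of per-sample output errors for weight vector `p`; LM's quadratic convergence near the optimum is what makes it attractive for small networks.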

CapBand

Hoang Truong, Tam Vu, Shuo Zhang, Ufuk Muncuk, Phuc Nguyen, Nam Bui, Anh Nguyen, Qin Lv, Kaushik Chowdhury, Thang Dinh
<span title="">2018</span> <i title="ACM Press"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/5j2fcrioqzharcoeonnjm7exwq" style="color: black;">Proceedings of the 16th ACM Conference on Embedded Networked Sensor Systems - SenSys &#39;18</a> </i> &nbsp;
We build a wrist muscles-to-gesture model, based on which we develop a hand gesture classification method using both motion and static features.  ...  We present CapBand, a battery-free hand gesture recognition wearable in the form of a wristband.  ...  This work is supported in part by the US National Science Foundation (NSF) through grant CNS 1619392 and 1528138.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/3274783.3274854">doi:10.1145/3274783.3274854</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/sensys/TruongZMNBNLCDV18.html">dblp:conf/sensys/TruongZMNBNLCDV18</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/gs6apsivnvfb7pnx3umkwh2wzq">fatcat:gs6apsivnvfb7pnx3umkwh2wzq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20190203170043/http://mnslab.org:80/anhnguyen/papers/SenSys-18-CapBand.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/10/ab/10abe995418d3b0bfea1df606b2fdddf2b7e9df5.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/3274783.3274854"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> acm.org </button> </a>

Integrating spatial sensing to an interactive mobile 3D map

V. Lehtinen, A. Nurminen, A. Oulasvirta
<span title="">2012</span> <i title="IEEE"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/yhgexhggvrg4nn7uzlw3wl7q7a" style="color: black;">2012 IEEE Symposium on 3D User Interfaces (3DUI)</a> </i> &nbsp;
The two techniques showed differential benefits for target acquisition performance.  ...  The technique offers easy shifting from the viewport-coupled mode to a top-down view where movement is POI-based.  ...  Previously reported solutions have offered two very different modes of interaction for visible and distant targets, such as a radar on the bottom of an AR display [2] , or gestures that switch between  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/3dui.2012.6184177">doi:10.1109/3dui.2012.6184177</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/3dui/LehtinenNO12.html">dblp:conf/3dui/LehtinenNO12</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/6oviiwpekng7jfa73ti4jcfpzm">fatcat:6oviiwpekng7jfa73ti4jcfpzm</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170922022940/https://people.mpi-inf.mpg.de/~oantti/pubs/3dui-spatialsensing.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/db/56/db56f6d60ccf031109365eebf2af47375aea47ea.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/3dui.2012.6184177"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a>

Simultaneous Hand Gesture Classification and Finger Angle Estimation via a Novel Dual-Output Deep Learning Model

Qinghua Gao, Shuo Jiang, Peter B. Shull
<span title="2020-05-24">2020</span> <i title="MDPI AG"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/taedaf6aozg7vitz5dpgkojane" style="color: black;">Sensors</a> </i> &nbsp;
We thus propose a dual-output deep learning model to enable simultaneous hand gesture classification and finger angle estimation.  ...  Ten subjects performed experimental testing by flexing/extending each finger at the metacarpophalangeal joint while the proposed model was used to classify each hand gesture and estimate continuous finger  ...  For hand gesture recognition with deep learning, Geng et al. [32] used EMG images and a deep convolutional network for an 8-gesture within-subject recognition.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/s20102972">doi:10.3390/s20102972</a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pubmed/32456330">pmid:32456330</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/mxwfqgg2bnhtrbwk6rfyyw5kma">fatcat:mxwfqgg2bnhtrbwk6rfyyw5kma</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200526031815/https://res.mdpi.com/d_attachment/sensors/sensors-20-02972/article_deploy/sensors-20-02972.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/8b/7e/8b7e699accca114adbb969e87cae8453db512d34.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/s20102972"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> mdpi.com </button> </a>

Gesture Recognition: A Survey

Sushmita Mitra, Tinku Acharya
<span title="">2007</span> <i title="Institute of Electrical and Electronics Engineers (IEEE)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/yppgahlq6ndebeq5yvdc23slyy" style="color: black;">IEEE Transactions on Systems Man and Cybernetics Part C (Applications and Reviews)</a> </i> &nbsp;
In this paper, we provide a survey on gesture recognition with particular emphasis on hand gestures and facial expressions.  ...  Gesture recognition pertains to recognizing meaningful expressions of motion by a human, involving the hands, arms, face, head, and/or body.  ...  HMMs for Hand Gesture Recognition HMM is a rich tool used for hand gesture recognition in diverse application domains.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/tsmcc.2007.893280">doi:10.1109/tsmcc.2007.893280</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/iywyfj465zgfnp4n2o7usgcsne">fatcat:iywyfj465zgfnp4n2o7usgcsne</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20140308233836/http://cs.nccu.edu.tw:80/~whliao/hcie2007/gesture_recognition.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/80/f6/80f67ec9d8005c70cb9e18cf58195c4abe32deef.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/tsmcc.2007.893280"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a>

Accurate Personal Identification by Hand Gesture Recognition

Shilpa K N
<span title="2016-05-29">2016</span> <i title="Valley International"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/vdctvzhdqvc4jpfrnfd2m544qa" style="color: black;">INTERNATIONAL JOURNAL OF EMERGING TRENDS IN SCIENCE AND TECHNOLOGY</a> </i> &nbsp;
Gestures can originate from any bodily motion or state but commonly originate from the face or hands. Current focuses in the field include emotion recognition from the face and hand gesture recognition.  ...  Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms.  ...  HAND MODELING-GESTURE RECOGNITION The human hand is an articulated object with 27 bones and 5 fingers. Each of these fingers consists of three joints.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.18535/ijetst/v3i05.27">doi:10.18535/ijetst/v3i05.27</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/57yyodbjkbfwfnfffaqcjszw2u">fatcat:57yyodbjkbfwfnfffaqcjszw2u</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20180602210104/http://ijetst.in/article/v3-i5/27%20ijetst.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/fc/7d/fc7d21a54cbd53a1767da1c304dd7c2822f8c1b6.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.18535/ijetst/v3i05.27"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> Publisher / doi.org </button> </a>

EMI Spy: Harnessing electromagnetic interference for low-cost, rapid prototyping of proxemic interaction

Nan Zhao, Gershon Dublon, Nicholas Gillian, Artem Dementyev, Joseph A. Paradiso
<span title="">2015</span> <i title="IEEE"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/cpz45fh55renximzow6wama4ee" style="color: black;">2015 IEEE 12th International Conference on Wearable and Implantable Body Sensor Networks (BSN)</a> </i> &nbsp;
We demonstrate the feasibility of mobile, EMI-based device and gesture recognition with preliminary user studies in 3 scenarios, achieving 96% classification accuracy at close range for 6 digital signage  ...  We present a wearable system that uses ambient electromagnetic interference (EMI) as a signature to identify electronic devices and support proxemic interaction.  ...  These signals can also be used to enable gesture and touch recognition on existing devices.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/bsn.2015.7299402">doi:10.1109/bsn.2015.7299402</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/bsn/ZhaoDGDP15.html">dblp:conf/bsn/ZhaoDGDP15</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/m5xc6p7mjfdalbxmmkoxpsaram">fatcat:m5xc6p7mjfdalbxmmkoxpsaram</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170923023358/http://dspace.mit.edu/bitstream/handle/1721.1/103785/emi-spy-harnessing.pdf?sequence=1" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/d8/c0/d8c0236622b3d8def652ee1b76683930a5104d89.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/bsn.2015.7299402"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a>

zSense

Anusha Withana, Roshan Peiris, Nipuna Samarasekara, Suranga Nanayakkara
<span title="">2015</span> <i title="ACM Press"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/4obeqfuqs5gzbos5ah64a7m57a" style="color: black;">Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI &#39;15</a> </i> &nbsp;
In this paper we present zSense, which provides greater input expressivity for spatially limited devices such as smart wearables through a shallow depth gesture recognition system using non-focused  ...  Our evaluations reported over 94.8% gesture recognition accuracy across all configurations.  ...  All of them were right-handed and used their dominant hand to perform the gestures.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/2702123.2702371">doi:10.1145/2702123.2702371</a> <a target="_blank" rel="external noopener" href="https://dblp.org/rec/conf/chi/WithanaPSN15.html">dblp:conf/chi/WithanaPSN15</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/iqez7hmkyrdf5c53ilqt4t3lh4">fatcat:iqez7hmkyrdf5c53ilqt4t3lh4</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170210045906/http://ahlab.org/sites/default/files/2015-04_CHI-zSense_Full.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/77/29/77291ef9c13c98ca76038046120e3f7c2e0a4eac.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1145/2702123.2702371"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> acm.org </button> </a>
Showing results 1 &mdash; 15 out of 8,336 results