2,939 Hits in 5.8 sec

Making Sense of Vision and Touch: Self-Supervised Learning of Multimodal Representations for Contact-Rich Tasks [article]

Michelle A. Lee, Yuke Zhu, Krishnan Srinivasan, Parth Shah, Silvio Savarese, Li Fei-Fei, Animesh Garg, Jeannette Bohg
2019 arXiv   pre-print
We use self-supervision to learn a compact and multimodal representation of our sensory inputs, which can then be used to improve the sample efficiency of our policy learning.  ...  Contact-rich manipulation tasks in unstructured environments often require both haptic and visual feedback.  ...  Even in routine tasks such as inserting a car key into the ignition, humans effortlessly combine the senses of vision and touch to complete the task.  ... 
arXiv:1810.10191v2 fatcat:uj3cpmgk7rdw7csyc52blkd6tq
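As a rough illustration of the multimodal fusion idea summarized in this entry, the sketch below encodes an RGB image and a 6-D force/torque reading into a single compact latent vector. It is a minimal, hypothetical PyTorch example: the layer sizes, input shapes, and concatenation-based fusion are assumptions for illustration, not the paper's architecture or its self-supervised objectives.

```python
# Hypothetical vision + force/torque fusion encoder; sizes and fusion choice
# are illustrative assumptions, not the architecture from the paper above.
import torch
import torch.nn as nn

class MultimodalEncoder(nn.Module):
    def __init__(self, latent_dim=128):
        super().__init__()
        # Small CNN for RGB images (assumed 3x64x64 input).
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
        )
        # MLP for a 6-D force/torque (wrench) reading.
        self.touch = nn.Sequential(nn.Linear(6, 32), nn.ReLU())
        # Fuse by concatenation, then project to a compact latent vector.
        self.fuse = nn.Linear(32 * 14 * 14 + 32, latent_dim)

    def forward(self, image, wrench):
        z_v = self.vision(image)
        z_t = self.touch(wrench)
        return self.fuse(torch.cat([z_v, z_t], dim=-1))

# Usage: encode a batch of paired observations into one shared representation.
enc = MultimodalEncoder()
z = enc(torch.randn(8, 3, 64, 64), torch.randn(8, 6))
print(z.shape)  # torch.Size([8, 128])
```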

Touch-based Curiosity for Sparse-Reward Tasks [article]

Sai Rajeswar, Cyril Ibrahim, Nitin Surya, Florian Golemo, David Vazquez, Aaron Courville, Pedro O. Pinheiro
2021 arXiv   pre-print
Robots in many real-world settings have access to force/torque sensors in their gripper and tactile sensing is often necessary in tasks that involve contact-rich motion.  ...  We test our approach on a range of touch-intensive robot arm tasks (e.g. pushing objects, opening doors), which we also release as part of this work.  ...  Acknowledgements We would like to thank Deepak Pathak, Glen Berseth, Edward Smith and Krishna Murthy for valuable feedback and insightful comments.  ... 
arXiv:2104.00442v2 fatcat:hoaoizrb3rcvpflyrjpv3qpg2y
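To make the "touch-based curiosity" idea concrete, the sketch below computes an intrinsic exploration bonus from tactile prediction error, a common curiosity formulation. The predictor, reward scale, and touch-feature definition are stand-in assumptions, not the exact cross-modal objective used in the paper above.

```python
# Generic curiosity-style intrinsic reward from tactile prediction error.
# Minimal sketch under stated assumptions; not the paper's exact objective.
import numpy as np

class TouchCuriosity:
    """Rewards the agent where its prediction of the touch signal is poor."""

    def __init__(self, predictor, scale=0.1):
        self.predictor = predictor  # maps (observation, action) -> predicted touch
        self.scale = scale

    def intrinsic_reward(self, observation, action, touch_next):
        predicted = self.predictor(observation, action)
        error = np.mean((predicted - touch_next) ** 2)
        return self.scale * error

# Usage with a stand-in predictor; a real system would learn this model online.
dummy_predictor = lambda obs, act: np.zeros(6)   # pretend force/torque prediction
curiosity = TouchCuriosity(dummy_predictor)
r_int = curiosity.intrinsic_reward(None, None, np.array([0.0, 0.0, 1.5, 0.0, 0.0, 0.1]))
print(r_int)  # larger touch prediction error -> larger exploration bonus
```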

Does Touching Real Objects Affect Learning?

Magdalena Novak, Stephan Schwan
2020 Educational Psychology Review  
Therefore, the haptic sense is crucial for interacting with the environment and with each other, forming an integral part of our multimodal system (Minogue and Jones 2006; Smith and ...).  ...  Based on theories of multimedia learning, the present study investigated whether the haptic sense serves as an additional channel to enhance the learning experience and learning outcomes.  ...  Thus, in cases of fully compatible inputs from vision and haptics, as when an object is looked at while touching it, a unified, multimodal representation is built (Hollins 2010).  ... 
doi:10.1007/s10648-020-09551-z fatcat:4n7kcspqdzb3lb7fvluxpwhjpy

Guided by touch

Ricky Jacob, Peter Mooney, Adam C. Winstanley
2011 Proceedings of the 1st international workshop on Mobile location-based service - MLBS '11  
Haptics is a feedback technology that takes advantage of the human sense of touch by applying forces, vibrations, and/or motions to a haptic-enabled user device such as a mobile phone.  ...  In this paper we describe four haptic feedback-based prototypes for pedestrian navigation.  ...  Acknowledgements Research in this paper is carried out as part of the Strategic Research Cluster grant (07/SRC/I1168) funded by Science Foundation Ireland under the National Development Plan.  ... 
doi:10.1145/2025876.2025881 fatcat:eywvtld7kzawpohnf7vthz4sty

The Feeling of Success: Does Touch Sensing Help Predict Grasp Outcomes? [article]

Roberto Calandra, Andrew Owens, Manu Upadhyaya, Wenzhen Yuan, Justin Lin, Edward H. Adelson, Sergey Levine
2017 arXiv   pre-print
In this work, we investigate the question of whether touch sensing aids in predicting grasp outcomes within a multimodal sensing framework that combines vision and touch.  ...  Deducing whether a particular grasp will be successful from indirect measurements, such as vision, is therefore quite challenging, and direct sensing of contacts through touch sensing provides an appealing  ...  Acknowledgments We thank Chris Myers, Dan Chapman and the CITRIS invention lab for their support with 3D printing, and Siyuan Dong for the technical support with the GelSights.  ... 
arXiv:1710.05512v1 fatcat:mtlrs22zkre5djopff4gqqbrcy

More Than a Feeling: Learning to Grasp and Regrasp Using Vision and Touch

Roberto Calandra, Andrew Owens, Dinesh Jayaraman, Justin Lin, Wenzhen Yuan, Jitendra Malik, Edward H. Adelson, Sergey Levine
2018 IEEE Robotics and Automation Letters  
For humans, the process of grasping an object relies heavily on rich tactile feedback.  ...  This model -- a deep, multimodal convolutional network -- predicts the outcome of a candidate grasp adjustment, and then executes a grasp by iteratively selecting the most promising actions.  ...  Our method uses rich touch sensing that is aware of texture and surface shape, simultaneously incorporates multiple modalities, and can flexibly accommodate additional constraints, such as minimum-force  ... 
doi:10.1109/lra.2018.2852779 dblp:journals/ral/CalandraOJLYMAL18 fatcat:wj2ikfy4bzdtfe33pyves7yydy
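The snippet above describes scoring candidate grasp adjustments with a learned success predictor and executing the most promising one. The loop below is a hypothetical, simplified rendering of that selection step: the candidate sampler, action parameterization, and stand-in model are assumptions for illustration, not the paper's trained multimodal CNN.

```python
# Illustrative action-selection loop: score candidate grasp adjustments and
# pick the best. The sampler, bounds, and model here are hypothetical.
import numpy as np

def select_grasp_adjustment(success_model, image, tactile, num_candidates=64):
    # Sample small candidate adjustments (dx, dy, dtheta) around the current grasp.
    candidates = np.random.uniform(low=[-0.02, -0.02, -0.3],
                                   high=[0.02, 0.02, 0.3],
                                   size=(num_candidates, 3))
    # Predict success probability for each candidate from vision + touch.
    scores = np.array([success_model(image, tactile, a) for a in candidates])
    return candidates[np.argmax(scores)], scores.max()

# Usage with a stand-in model; a real system would query a trained predictor.
fake_model = lambda img, tac, a: 1.0 / (1.0 + np.linalg.norm(a))
best_action, best_score = select_grasp_adjustment(fake_model, image=None, tactile=None)
print(best_action, best_score)
```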

Tactile Hypersensitivity and "Overwhelming Subjectivity" in the Touch Experience of People With Congenital Deafblindness: Implications for a Touch-Based Pedagogy

Kirsten Costain
2020 Frontiers in Education  
Helping to overcome the deprivation and isolation caused by the overwhelming activation of tactual subjectivity that occurs in touch hyper-sensitivity is an important goal for the partners of people with  ...  the support of people with CDB and other forms of multiple disability and touch hyper-sensitivity.  ...  ACKNOWLEDGMENTS I wish to thank my employer, Statped, Norway for funding support for the production of this article.  ... 
doi:10.3389/feduc.2020.582808 fatcat:2lvt7bfcgzfntcc6tdszdhh7cq

Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence

Annett Schirmer, Ralph Adolphs
2017 Trends in Cognitive Sciences  
Regions typically more active for emotional as compared to neutral stimuli are marked in green, blue, and red for voice, face, and touch, respectively.  ...  Regions typically more active for emotional multimodal as compared to unimodal stimulation are marked in beige.  ...  Higher-level representations can feed back and modulate lower-level representations. Comparing the processing characteristics of vision, audition, and touch for perceiving emotions.  ... 
doi:10.1016/j.tics.2017.01.001 pmid:28173998 pmcid:PMC5334135 fatcat:dbnemcccyfbjbljccyv6l2dx3u

Designing Media for Visually-Impaired Users of Refreshable Touch Displays: Possibilities and Pitfalls

Sile O'Modhrain, Nicholas A. Giudice, John A. Gardner, Gordon E. Legge
2015 IEEE Transactions on Haptics  
and render images for both hard-copy and technology-mediated presentation of Braille and tangible graphics.  ...  The paper considers the influence of human factors on effectiveness of presentation as well as the strengths and weaknesses of tactile, vibrotactile, static pins, haptic, force feedback, and multimodal  ...  Whatever the specific technology may be, it makes sense for future refreshable touch displays to build on this multimodal, multi-use model of graphical access.  ... 
doi:10.1109/toh.2015.2466231 pmid:26276998 fatcat:g2gwz6xgtvhshnunyv6lravpv4

Grounding language in the neglected senses of touch, taste, and smell

Laura J. Speed, Asifa Majid
2019 Cognitive Neuropsychology  
Comprehending language related to these senses may instead rely on simulation of emotion, as well as crossmodal simulation of the "higher" senses of vision and audition.  ...  Such theories have focused predominantly on the dominant senses of sight and hearing.  ...  Acknowledgements We thank Shirley-Ann Rueschemeyer, Bodo Winter, and one anonymous reviewer for comments on the manuscript.  ... 
doi:10.1080/02643294.2019.1623188 pmid:31230566 fatcat:jj2cotguhffzfbwqayqi2sezru

Touch and Go — Designing Haptic Feedback for a Hand-Held Mobile Device

S O'Modhrain
2004 BT technology journal  
In this paper I will propose that, for the mobile user negotiating these multiple frames of reference for their actions, a better understanding of the senses of touch, of the body's motion and its sense  ...  The coincidence of connectedness, awareness and richly multimodal input and output capabilities brings into the hand a device capable of supporting an entirely new class of haptic or touch-based interactions  ...  of this task, and then to select the technical solutions that will make it possible to generate the appropriate touch effects.  ... 
doi:10.1023/b:bttj.0000047592.21315.ce fatcat:47efyowwqjad5grxowafutuu7u

Analyzing visually impaired people's touch gestures on smartphones

Maria Claudia Buzzi, Marina Buzzi, Barbara Leporini, Amaury Trujillo
2016 Multimedia tools and applications  
We then examined their touch-based gesture preferences in terms of number of strokes, multi-touch, and shape angle, as well as their execution in geometric, kinematic and relative terms.  ...  To that end, we recruited 36 visually impaired participants and divided them into two main groups of low-vision and blind people respectively.  ...  Nonetheless, learning new gestures still remains a challenging task for blind people.  ... 
doi:10.1007/s11042-016-3594-9 fatcat:dbnrxrkerjcv3lepyo5eiaves4

Mediated social touch: a review of current research and future directions

Antal Haans, Wijnand IJsselsteijn
2005 Virtual Reality  
Based on social psychological literature on touch, communication, and the effects of media, we assess the current research and design efforts and propose future directions for the field of mediated social  ...  Whereas current communication media rely predominately on vision and hearing, mediated social touch allows people to touch each other over a distance by means of haptic feedback technology.  ...  This research was supported by the JF Schouten School for User-System Interaction Research at Eindhoven University of Technology, Eindhoven, The Netherlands  ... 
doi:10.1007/s10055-005-0014-2 fatcat:p4rknajg4vek7prjiqveugecim

Navigating by Touch: Haptic Monte Carlo Localization via Geometric Sensing and Terrain Classification [article]

Russell Buchanan, Jakub Bednarek, Marco Camurri, Michał R. Nowicki, Krzysztof Walas, Maurice Fallon
2021 arXiv   pre-print
Then, a Monte Carlo-based estimator fuses this terrain class probability with the geometric information of the foot contact points.  ...  Legged robot navigation in extreme environments can hinder the use of cameras and laser scanners due to darkness, air obfuscation or sensor damage.  ...  Acknowledgments This research has been conducted as part of the ANYbotics research community.  ... 
arXiv:2108.08015v1 fatcat:mazn5kg3qnglxjj5fvludpdfpe
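The snippet above describes fusing a terrain-class probability with the geometry of foot contact points inside a Monte Carlo estimator. The sketch below shows a generic particle-weight update of that flavor: the map lookups, Gaussian height-noise model, and toy values are illustrative assumptions, not the paper's estimator.

```python
# Minimal particle-weight update fusing a terrain-class likelihood with a
# geometric (foot-height) likelihood. Maps and noise model are assumptions.
import numpy as np

def update_weights(particles, weights, foot_z, class_probs,
                   terrain_map, height_map, sigma_z=0.02):
    new_weights = np.empty_like(weights)
    for i, (x, y) in enumerate(particles):
        # Geometric likelihood: measured foot height vs. map height at the particle.
        expected_z = height_map(x, y)
        geom = np.exp(-0.5 * ((foot_z - expected_z) / sigma_z) ** 2)
        # Terrain likelihood: probability the classifier assigned to the map's class here.
        terrain = class_probs[terrain_map(x, y)]
        new_weights[i] = weights[i] * geom * terrain
    total = new_weights.sum()
    return new_weights / total if total > 0 else np.full_like(weights, 1.0 / len(weights))

# Usage with toy maps: flat ground of class 0 ("grass"), classifier 80% sure it's grass.
particles = np.random.uniform(0, 1, size=(100, 2))
weights = np.full(100, 1.0 / 100)
weights = update_weights(particles, weights, foot_z=0.0,
                         class_probs=np.array([0.8, 0.2]),
                         terrain_map=lambda x, y: 0,
                         height_map=lambda x, y: 0.0)
print(weights.sum())  # ~1.0 after normalization
```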

Digitally-mediated parent–baby touch and the formation of subjectivities

Carey Jewitt, Kerstin Leder Mackley, Sara Price
2021 Visual Communication  
This article examines how the use of emergent smart baby monitors re-mediates parent–baby touch, notions of connection, parental sensing and the interpretation of babies' bodies, and contributes to the  ...  The authors discuss multimodal discourses pertinent to the shaping of parent–baby touch practices including: rationality and efficiency; individualism, autonomy and freedom; and self-improvement and empowerment  ...  This study prompts us to ask how the multimodal and sensorial representational shifts facilitated by Owlet might play out for the future of digital touch technologies, parental touch practices and acuity  ... 
doi:10.1177/1470357220961412 fatcat:uqkdqjr6tveunm4la5kfqymu7m
Showing results 1 — 15 out of 2,939 results