
Worker Activity Recognition in Smart Manufacturing Using IMU and sEMG Signals with Convolutional Neural Networks

Wenjin Tao, Ze-Hao Lai, Ming C. Leu, Zhaozheng Yin
2018 Procedia Manufacturing  
In this paper, we propose a method for activity recognition using Inertial Measurement Unit (IMU) and surface electromyography (sEMG) signals obtained from a Myo armband.  ...  It demonstrates that the activity images from the IMU signals provide more discriminative features for activity recognition.  ... 
doi:10.1016/j.promfg.2018.07.152 fatcat:2cosk6kp2vcetdvoyrdrfpd2k4
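The entry above converts IMU signals into "activity images" so that convolutional networks can be applied, but the snippet does not spell out the transform. The sketch below is only a generic variant under stated assumptions: a fixed-length window, per-channel min-max normalization, and one image row per IMU channel. None of these choices are taken from the paper itself.

```python
import numpy as np

def imu_window_to_image(window: np.ndarray) -> np.ndarray:
    """Turn a (timesteps, channels) IMU window into a 2D 'activity image'.

    Each channel is min-max normalized to [0, 1] and placed on its own row,
    so a CNN can look for patterns across time (columns) and channels (rows).
    This is a generic illustrative transform, not the paper's exact method.
    """
    mins = window.min(axis=0, keepdims=True)
    maxs = window.max(axis=0, keepdims=True)
    normalized = (window - mins) / (maxs - mins + 1e-8)
    return normalized.T  # shape (channels, timesteps)

# Example: a fake 2-second window at 50 Hz with 6 IMU channels (accel + gyro)
rng = np.random.default_rng(0)
fake_window = rng.standard_normal((100, 6))
image = imu_window_to_image(fake_window)
print(image.shape)  # (6, 100), usable as a 1-channel image input to a 2D CNN
```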

Are You Wearing a Mask? Detecting If a Person Wears a Mask Using a Wristband

Constantino Msigwa, Seungwoo Baek, Denis Bernard, Jaeseok Yun
2022 Sensors  
images collected from 25 subjects.  ...  In this paper, we propose an activity recognition method based on a wristband equipped with an IR array and inertial measurement unit (IMU) to detect individual compliance with codes of personal hygiene  ...  Data Availability Statement: Not applicable. Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s22051745 pmid:35270893 pmcid:PMC8915108 fatcat:fupivnzqybbujenr7rmylnfvqu

Deep ConvLSTM Network with Dataset Resampling for Upper Body Activity Recognition Using Minimal Number of IMU Sensors

Xiang Yang Lim, Kok Beng Gan, Noor Azah Abd Aziz
2021 Applied Sciences  
Human activity recognition (HAR) is the study of the identification of specific human movement and action based on images, accelerometer data and inertial measurement unit (IMU) sensors.  ...  The imbalanced class distribution is another challenge to the recognition of human activity in real life.  ...  Activity 4 Open then close door 1  ... 
doi:10.3390/app11083543 fatcat:zaoldzbhmnddrchfzgnsdqw37u
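The entry above highlights class imbalance and dataset resampling as a prerequisite for training the ConvLSTM. The snippet gives no details, so the sketch below shows only a minimal random-oversampling scheme over windowed IMU data; the balancing strategy and the helper name `oversample_to_balance` are illustrative assumptions, not the authors' resampling method.

```python
import numpy as np

def oversample_to_balance(windows: np.ndarray, labels: np.ndarray, seed: int = 0):
    """Randomly oversample minority classes until every class has as many
    windows as the largest class. A simple stand-in for dataset resampling;
    not the exact scheme used in the paper above.
    """
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(labels, return_counts=True)
    target = counts.max()
    keep_idx = []
    for cls in classes:
        idx = np.flatnonzero(labels == cls)
        extra = rng.choice(idx, size=target - idx.size, replace=True)
        keep_idx.append(np.concatenate([idx, extra]))
    keep_idx = rng.permutation(np.concatenate(keep_idx))
    return windows[keep_idx], labels[keep_idx]

# Example: 3 activity classes with 100 / 30 / 10 windows each
X = np.random.randn(140, 128, 6)             # (windows, timesteps, channels)
y = np.repeat([0, 1, 2], [100, 30, 10])
Xb, yb = oversample_to_balance(X, y)
print(np.unique(yb, return_counts=True))     # every class now has 100 windows
```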

IMU-Based Movement Trajectory Heatmaps for Human Activity Recognition

Orhan Konak, Pit Wegner, Bert Arnrich
2020 Sensors  
Recent trends in ubiquitous computing have led to a proliferation of studies that focus on human activity recognition (HAR) utilizing inertial sensor data that consist of acceleration, orientation and  ...  In image classification, this limitation has been mitigated by powerful oversampling techniques such as data augmentation.  ...  [14] proposed a method for fine-grained hand activity recognition based on acceleration data obtained from a commodity-based smartwatch.  ... 
doi:10.3390/s20247179 pmid:33333839 fatcat:yy4qkuqyujfybpb5kw3dsnfru4
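The entry above builds heatmap images from IMU-derived movement trajectories. As a rough illustration only, the sketch below double-integrates x/y acceleration into a position trace and bins it into a 2D occupancy histogram; real IMU integration drifts badly, and the sampling rate, bin count, and normalization are assumptions rather than the paper's pipeline.

```python
import numpy as np

def trajectory_heatmap(accel_xy: np.ndarray, fs: float = 50.0, bins: int = 32) -> np.ndarray:
    """Build a 2D heatmap of the movement path implied by x/y acceleration.

    Acceleration is double-integrated (cumulative sums scaled by the sample
    period) into a position trace, then binned into a bins x bins histogram.
    Integration drift is deliberately ignored in this illustrative sketch.
    """
    dt = 1.0 / fs
    velocity = np.cumsum(accel_xy, axis=0) * dt
    position = np.cumsum(velocity, axis=0) * dt
    heatmap, _, _ = np.histogram2d(position[:, 0], position[:, 1], bins=bins)
    return heatmap / max(heatmap.max(), 1e-8)   # normalize to [0, 1] for a CNN

accel = np.random.randn(500, 2)                 # fake 10 s of x/y acceleration at 50 Hz
print(trajectory_heatmap(accel).shape)          # (32, 32)
```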

Multi-Modal Recognition of Worker Activity for Human-Centered Intelligent Manufacturing [article]

Wenjin Tao, Ming C. Leu, Zhaozheng Yin
2019 arXiv   pre-print
In this paper, we propose a novel multi-modal approach for worker activity recognition by leveraging information from different sensors and in different modalities.  ...  For the IMU signals, we design two novel feature transform mechanisms, in both frequency and spatial domains, to assemble the captured IMU signals as images, which allow using convolutional neural networks  ...  However, visual-based recognition suffers from the occlusion issue, which affects the recognition accuracy.  ... 
arXiv:1908.07519v1 fatcat:g5ttspvjpfdr7cenhoctnbnxui
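The entry above mentions feature transforms in both the frequency and spatial domains that turn IMU signals into images. The sketch below shows one generic frequency-domain option: a short-time Fourier magnitude image computed in plain NumPy for a single channel. The frame length, hop size, Hann window, and log scaling are arbitrary illustrative choices, not the transform designed in the paper.

```python
import numpy as np

def imu_spectrogram(signal: np.ndarray, frame_len: int = 64, hop: int = 16) -> np.ndarray:
    """Short-time Fourier magnitude of a single IMU channel.

    Frames the 1D signal, applies a Hann window, and stacks FFT magnitudes
    into a (frequency, time) image suitable for a CNN.
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop: i * hop + frame_len] * window
                       for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1))   # (time, frequency)
    return np.log1p(spec).T                      # (frequency, time) image

x = np.sin(2 * np.pi * 3 * np.linspace(0, 4, 200)) + 0.1 * np.random.randn(200)
print(imu_spectrogram(x).shape)                  # (33, 9) for these settings
```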

A Hierarchical Deep Fusion Framework for Egocentric Activity Recognition Using a Wearable Hybrid Sensor System

Haibin Yu, Guoxiong Pan, Mian Pan, Chong Li, Wenyan Jia, Li Zhang, Mingui Sun
2019 Sensors  
The motion sensor data are used solely for activity classification according to motion state, while the photo stream is used for further specific activity recognition in the motion state groups.  ...  Long short-term memory (LSTM) and a convolutional neural network are used to perform egocentric ADL recognition based on motion sensor data and photo streaming in different layers, respectively.  ...  The authors would like to acknowledge all the participants for their significant contributions to this research study, as well as Sibo Song et al. for providing the online public Multimodal Egocentric Activity  ... 
doi:10.3390/s19030546 fatcat:im2clnjovzfvnng4ltq5qgeqcu
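The entry above describes a hierarchy in which motion sensor data first select a motion-state group and the photo stream is then classified within that group. The sketch below is only a schematic two-stage dispatcher with stub classifiers standing in for the paper's LSTM and CNN; the group names, activity labels, and stub logic are hypothetical.

```python
from typing import Callable, Dict

# Stub classifiers; in the paper these roles are played by an LSTM (motion
# sensors) and a CNN (photo stream). The stubs below are placeholders only.
def classify_motion_state(imu_window) -> str:
    return "ambulatory" if max(imu_window) > 1.5 else "stationary"

def make_photo_classifier(labels) -> Callable:
    def classify(photo) -> str:
        return labels[len(photo) % len(labels)]    # stand-in for a CNN
    return classify

# Fine-grained activities are grouped by motion state, so the photo-based
# classifier only has to separate activities within the selected group.
GROUPS: Dict[str, Callable] = {
    "stationary": make_photo_classifier(["reading", "eating", "watching TV"]),
    "ambulatory": make_photo_classifier(["walking outside", "shopping", "cleaning"]),
}

def hierarchical_recognize(imu_window, photo) -> str:
    state = classify_motion_state(imu_window)      # stage 1: motion sensors
    return GROUPS[state](photo)                    # stage 2: photo stream

print(hierarchical_recognize([0.2, 0.4, 2.1], "frame_0001.jpg"))
```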

IMUTube: Automatic Extraction of Virtual on-body Accelerometry from Video for Human Activity Recognition [article]

Hyeokhyen Kwon, Catherine Tong, Harish Haresamudram, Yan Gao, Gregory D. Abowd, Nicholas D. Lane, Thomas Ploetz
2020 arXiv   pre-print
The lack of large-scale, labeled data sets impedes progress in developing robust and generalized predictive models for on-body sensor-based human activity recognition (HAR).  ...  streams of IMU data.  ...  EXTRACTING VIRTUAL IMU DATA FROM VIDEOS The key idea of our work is to replace the conventional data collection procedure that is typically employed for the development of sensor-based human activity recognition  ... 
arXiv:2006.05675v2 fatcat:o56v3cs2hbaidpawewkgbj2w6m
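The entry above extracts virtual on-body accelerometry from video. The full IMUTube pipeline (pose estimation, 3D lifting, calibration, distribution adaptation) is not reproducible from the snippet; the sketch below shows only the core numerical idea of double-differentiating a tracked 3D joint trajectory into a virtual acceleration signal. The gravity handling and axis convention are assumptions.

```python
import numpy as np

def virtual_acceleration(joint_xyz: np.ndarray, fps: float = 30.0) -> np.ndarray:
    """Approximate a body-worn accelerometer from a 3D joint trajectory.

    joint_xyz: (frames, 3) positions of one joint (e.g. a wrist) estimated
    from video. Two numerical derivatives give linear acceleration; gravity
    is added back on the (assumed vertical) z-axis so the output resembles
    raw accelerometer data. Heavily simplified and illustrative only.
    """
    dt = 1.0 / fps
    velocity = np.gradient(joint_xyz, dt, axis=0)
    accel = np.gradient(velocity, dt, axis=0)
    accel[:, 2] += 9.81                     # assumption: z is the vertical axis
    return accel

trajectory = np.cumsum(np.random.randn(90, 3) * 0.01, axis=0)   # 3 s of fake wrist motion
print(virtual_acceleration(trajectory).shape)                   # (90, 3)
```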

Complex Deep Neural Networks from Large Scale Virtual IMU Data for Effective Human Activity Recognition Using Wearables

Hyeokhyen Kwon, Gregory D. Abowd, Thomas Plötz
2021 Sensors  
Supervised training of human activity recognition (HAR) systems based on body-worn inertial measurement units (IMUs) is often constrained by the typically rather small amounts of labeled sample data.  ...  We have collected around 41 h of virtual IMU data using IMUTube from exercise videos available from YouTube.  ...  The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results  ... 
doi:10.3390/s21248337 pmid:34960431 pmcid:PMC8707382 fatcat:ex27jvnfxnbq3g7b2wfdqmmy2a

Gait Phase Recognition Using Deep Convolutional Neural Network with Inertial Measurement Units

Binbin Su, Christian Smith, Elena Gutierrez Farewik
2020 Biosensors  
User kinematics, measured from inertial measurement unit (IMU) output, can be considered as an 'image' since it exhibits some local 'spatial' pattern when the sensor data is arranged in sequence.  ...  We propose a specialized DCNN to distinguish five phases in a gait cycle, based on IMU data and classified with foot switch information.  ...  to identify human motion and activity from the signal of wearable sensors such as IMUs [31] [32] [33] [34] .  ... 
doi:10.3390/bios10090109 pmid:32867277 fatcat:mris2al26fgune3273ode4dudu
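The entry above treats sequential IMU data as an "image" and uses foot-switch signals as ground truth for five gait phases. As a hedged illustration of the data preparation only, the sketch below slices an IMU stream into fixed windows and assigns each the majority foot-switch phase; the window size, hop, and majority-vote labeling are assumptions, not the authors' protocol.

```python
import numpy as np

def window_with_phase_labels(imu: np.ndarray, phases: np.ndarray,
                             win: int = 50, hop: int = 25):
    """Slice an IMU stream into windows and label each with the majority
    gait phase from a synchronized foot-switch signal.

    imu:    (samples, channels) inertial data
    phases: (samples,) integer phase per sample, e.g. 0..4 for five phases
    Returns windows shaped (n, win, channels) and labels shaped (n,).
    """
    windows, labels = [], []
    for start in range(0, len(imu) - win + 1, hop):
        seg_phases = phases[start:start + win]
        windows.append(imu[start:start + win])
        labels.append(np.bincount(seg_phases).argmax())   # majority vote
    return np.stack(windows), np.array(labels)

imu = np.random.randn(1000, 6)
phases = np.repeat(np.arange(5), 200)          # fake phase sequence
X, y = window_with_phase_labels(imu, phases)
print(X.shape, y.shape)                        # (39, 50, 6) (39,)
```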

How You Move Your Head Tells What You Do: Self-supervised Video Representation Learning with Egocentric Cameras and IMU Sensors [article]

Satoshi Tsutsui, Ruta Desai, Karl Ridgeway
2021 arXiv   pre-print
We are particularly interested in learning egocentric video representations benefiting from the head-motion generated by users' daily activities, which can be easily obtained from IMU sensors embedded  ...  Understanding users' activities from head-mounted cameras is a fundamental task for Augmented and Virtual Reality (AR/VR) applications.  ...  That is, given a pair of randomly augmented images, their representations are encouraged to be similar if they are from the same image, and not if from different images.  ... 
arXiv:2110.01680v1 fatcat:t77vyi52x5ev7ap4yyvl5rblsi
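The snippet above describes a contrastive objective in which representations of two augmented views are pulled together when they come from the same source and pushed apart otherwise. The paper's exact self-supervised loss is not given in the snippet, so the sketch below is a generic InfoNCE-style loss in NumPy; the temperature and the pairing of views (augmented clips, or a clip with its synchronized IMU stream) are assumptions.

```python
import numpy as np

def contrastive_loss(z1: np.ndarray, z2: np.ndarray, temperature: float = 0.1) -> float:
    """InfoNCE-style loss over a batch of paired embeddings.

    z1[i] and z2[i] are two views of the same sample; matching rows are
    pulled together and all other rows in the batch are pushed apart.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature            # (batch, batch) similarity matrix
    logits -= logits.max(axis=1, keepdims=True) # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))  # diagonal entries are the positives

batch = np.random.randn(8, 128)
noisy_view = batch + 0.05 * np.random.randn(8, 128)
print(contrastive_loss(batch, noisy_view))      # small loss: the views nearly match
```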

A Dual-channel Artificial Neural Network Decision Fusion Framework Incorporated with Deep Learning of Inertial Measurement Unit Sensor-based Spectrum Images for Hand Gesture Intention Cognition

Ing-Jr Ding, Ya-Cheng Juang, Bing-Tsan Lin
2022 Journal of Imaging Science and Technology  
The proposed dual-channel ANN decision fusion framework contains one ANN recognition channel with inputs of "6-axis IMU raw data" and the other ANN recognition channel with inputs of "IMU spectrogram image  ...  We present a dual-channel artificial neural network (ANN) recognition decision hybridization scheme incorporated with deep learning of IMU-based spectrogram images for cognition of several common hand gesture  ...  According to variances of the employed sensor and the acquired data, hand gesture recognition can be primarily categorized into RGB image-based [12] [13] [14] , 3-dimensional (3-D) space data-based [  ... 
doi:10.2352/j.imagingsci.technol.2022.66.4.040403 fatcat:v6z7lnyrrrfa5ckov4zppcfbei
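The entry above fuses the decisions of one channel fed with raw 6-axis IMU data and another fed with IMU spectrogram images. The sketch below is only a minimal weighted score-fusion over the two channels' class probabilities; the convex-combination rule, the 0.5 weight, and the example probabilities are illustrative assumptions, not the paper's hybridization scheme.

```python
import numpy as np

def fuse_decisions(p_raw: np.ndarray, p_spec: np.ndarray, w_raw: float = 0.5) -> int:
    """Fuse class-probability vectors from two recognition channels.

    p_raw:  softmax output of the channel fed with 6-axis raw IMU data
    p_spec: softmax output of the channel fed with IMU spectrogram images
    A convex combination of the two score vectors picks the final gesture.
    """
    fused = w_raw * p_raw + (1.0 - w_raw) * p_spec
    return int(np.argmax(fused))

# Example: 4 gesture classes, the two channels disagree on their top class
p_raw = np.array([0.10, 0.55, 0.25, 0.10])
p_spec = np.array([0.05, 0.25, 0.65, 0.05])
print(fuse_decisions(p_raw, p_spec))   # 2: the spectrogram channel tips the decision
```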

Electromyogram in Cigarette Smoking Activity Recognition

Volkan Senyurek, Masudul Imtiaz, Prajakta Belsare, Stephen Tiffany, Edward Sazonov
2021 Signals  
unit (IMU) to augment recognition performance.  ...  The model was developed and evaluated with leave-one-subject-out (LOSO) cross-validation on a dataset from 16 subjects who performed ten activities of daily living including smoking.  ...  We also evaluated the combination of sEMG-based muscle activity and IMU-based motion for cigarette smoking recognition.  ... 
doi:10.3390/signals2010008 fatcat:k4z7hmwyorejvhmhl37mzsnlhi
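The entry above evaluates with leave-one-subject-out (LOSO) cross-validation on 16 subjects. The sketch below shows a bare LOSO loop with a placeholder scoring callable, since the actual sEMG+IMU model is not part of the snippet; the feature shapes and the majority-label baseline are purely illustrative.

```python
import numpy as np

def loso_cross_validation(X: np.ndarray, y: np.ndarray, subjects: np.ndarray,
                          train_and_score) -> float:
    """Leave-one-subject-out evaluation.

    Each unique subject id is held out once as the test set while the model
    is trained on everyone else; the mean held-out score is returned.
    `train_and_score(X_tr, y_tr, X_te, y_te)` is a user-supplied callable.
    """
    scores = []
    for subject in np.unique(subjects):
        test = subjects == subject
        scores.append(train_and_score(X[~test], y[~test], X[test], y[test]))
    return float(np.mean(scores))

# Placeholder "model": predict the majority training label (illustrative only)
def majority_baseline(X_tr, y_tr, X_te, y_te):
    majority = np.bincount(y_tr).argmax()
    return float(np.mean(y_te == majority))

X = np.random.randn(160, 24)                    # 160 windows of fused sEMG+IMU features
y = np.random.randint(0, 10, size=160)          # 10 activities of daily living
subjects = np.repeat(np.arange(16), 10)         # 16 subjects, 10 windows each
print(loso_cross_validation(X, y, subjects, majority_baseline))
```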

Attention-Based Sensor Fusion for Human Activity Recognition Using IMU Signals [article]

Wenjin Tao, Haodong Chen, Md Moniruzzaman, Ming C. Leu, Zhaozheng Yin, Ruwen Qin
2021 arXiv   pre-print
In this paper, we propose a novel attention-based approach to human activity recognition using multiple IMU sensors worn at different body locations.  ...  Human Activity Recognition (HAR) using wearable devices such as smart watches embedded with Inertial Measurement Unit (IMU) sensors has various applications relevant to our daily life, such as workout  ...  Related Work The critical factor attributed to the success of IMU-based activity recognition is to seek an effective representation of the time-series IMU signals.  ... 
arXiv:2112.11224v1 fatcat:it42nxqxorchdkruhjer7cutoe
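The entry above applies attention over multiple IMUs worn at different body locations. The sketch below computes softmax attention weights over per-sensor feature vectors and returns their weighted sum; the dot-product scoring vector and the five example sensor positions are generic assumptions, not the paper's architecture.

```python
import numpy as np

def attention_fuse(sensor_features: np.ndarray, score_vector: np.ndarray):
    """Fuse per-sensor feature vectors with a simple attention mechanism.

    sensor_features: (n_sensors, feat_dim), one embedding per worn IMU
    score_vector:    (feat_dim,) parameters that score each sensor
    Softmax over the scores weights sensors by relevance before summing.
    """
    scores = sensor_features @ score_vector            # (n_sensors,)
    scores -= scores.max()                             # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    fused = weights @ sensor_features                  # (feat_dim,)
    return fused, weights

features = np.random.randn(5, 64)        # e.g. wrist, ankle, chest, hip, upper arm
score_vec = np.random.randn(64)          # would be learned jointly with the classifier
fused, weights = attention_fuse(features, score_vec)
print(fused.shape, weights.round(2))     # (64,) and one weight per sensor
```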

Real Time Human Activity Recognition Using Acceleration and First-Person Camera data

Christos Androutsos, Nikolaos S. Tachos, Evanthia E. Tripoliti, Ioannis Karatzanis, Dimitris Manousos, Manolis Tsiknakis, Dimitrios I. Fotiadis
2021 Zenodo  
The aim of this work is to present an automated method, working in real time, for human activity recognition based on acceleration and first-person camera data.  ...  A Long Short-Term Memory (LSTM) model has been built for recognizing locomotive activities (i.e. walking, sitting, standing, going upstairs, going downstairs) from acceleration data, while a ResNet model  ...  The developed models utilize data streams from an IMU sensor and image frames from an FPC and are based on an LSTM architecture and a refined CNN model.  ... 
doi:10.5281/zenodo.5704879 fatcat:z3aq673anjgd7n56are7gghoau
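The entry above runs in real time on an acceleration stream (LSTM) and camera frames (ResNet). The sketch below covers only the buffering side of such a pipeline: a fixed-size ring buffer that emits an inference-ready window whenever enough new samples arrive. The window size, step size, and classifier stub are assumptions, not the authors' implementation.

```python
from collections import deque
import numpy as np

class StreamingWindower:
    """Collect live accelerometer samples and emit fixed-size windows.

    Every `step` new samples, the newest `win` samples are handed to a
    classifier (here a stub standing in for the LSTM mentioned above).
    """
    def __init__(self, win: int = 128, step: int = 32):
        self.buffer = deque(maxlen=win)
        self.win, self.step, self.new = win, step, 0

    def push(self, sample, classify):
        self.buffer.append(sample)
        self.new += 1
        if len(self.buffer) == self.win and self.new >= self.step:
            self.new = 0
            return classify(np.asarray(self.buffer))    # (win, 3) window
        return None

labels = ["walking", "sitting", "standing", "upstairs", "downstairs"]
stub_lstm = lambda window: labels[int(abs(window).mean() * 10) % len(labels)]

windower = StreamingWindower()
for t in range(300):                                    # simulate a live stream
    result = windower.push(np.random.randn(3) * 0.1, stub_lstm)
    if result:
        print("predicted:", result)
```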

Translating Videos into Synthetic Training Data for Wearable Sensor-Based Activity Recognition Systems Using Residual Deep Convolutional Networks

Vitor Fortes Rey, Kamalveer Kaur Garewal, Paul Lukowicz
2021 Applied Sciences  
Human activity recognition (HAR) using wearable sensors has benefited much less from recent advances in Deep Learning than fields such as computer vision and natural language processing.  ...  Thus, for example, ImageNet has images for around 100,000 categories (based on WordNet) with on average 1000 images per category (therefore up to 100,000,000 samples).  ...  [Table legend: only real IMU data / only IMU data simulated from our videos / only IMU data simulated from YouTube]  ... 
doi:10.3390/app11073094 doaj:338eff362f454ee7896d2128d2f4e43e fatcat:aw63otlfgjgjvgpq2wvogxh4ha
Showing results 1–15 of 5,246 results