Temporal action detection based on two-stream You Only Look Once network for elderly care service robot

Ke Wang, Xuejing Li, Jianhua Yang, Jun Wu, Ruifeng Li
2021 · <i>International Journal of Advanced Robotic Systems</i> (SAGE Publications)
Segmenting and recognizing human actions in a continuous, untrimmed sensor data stream is a challenging problem known as temporal action detection. This article proposes a two-stream You Only Look Once (YOLO)-based network that fuses video and skeleton streams captured by a Kinect sensor; our data encoding method turns spatiotemporal action detection into a one-dimensional object detection problem in a constantly augmented feature space. The proposed approach extracts spatial–temporal three-dimensional convolutional neural network features from the video stream and view-invariant features from the skeleton stream, respectively. These two streams are then encoded into three-dimensional feature spaces, represented as red, green, and blue images for subsequent network input. We propose two-stream YOLO-based networks that fuse video and skeleton information through a processing pipeline offering two fusion strategies: boxes-fusion and layers-fusion. We evaluate the temporal action detection performance of the two-stream YOLO network on our data set, High-Speed Interplanetary Tug/Cocoon Vehicles-v1, which contains seven activities in a home environment, and achieve a particularly high mean average precision. We also test our model on the public data set PKU-MMD, which contains 51 activities, where our method likewise performs well. To show that the method runs efficiently on robots, we deployed it on a robotic platform and conducted an online fall detection experiment.
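The abstract's core idea is to encode a skeleton sequence as an RGB image so that a YOLO-style detector can localize actions along the time axis. As a minimal sketch of this kind of encoding (not the authors' exact scheme), one can map each joint's (x, y, z) coordinates to (R, G, B) channel values, with image rows indexed by joint and columns by frame; the function name and normalization below are illustrative assumptions:

```python
import numpy as np

def skeleton_to_rgb(seq):
    """Encode a skeleton sequence of shape (T frames, J joints, 3 coords)
    as an RGB image: rows = joints, columns = time, (x, y, z) -> (R, G, B).
    A generic sketch of skeleton-to-image encoding, not the paper's
    exact pipeline."""
    seq = np.asarray(seq, dtype=np.float64)        # (T, J, 3)
    lo, hi = seq.min(), seq.max()
    norm = (seq - lo) / (hi - lo + 1e-8)           # scale coords to [0, 1]
    img = (norm * 255.0).astype(np.uint8)          # quantize to 8-bit channels
    return img.transpose(1, 0, 2)                  # (J, T, 3): joints x time x RGB

# Example: 30 frames of the 25 joints a Kinect v2 tracks
rng = np.random.default_rng(0)
image = skeleton_to_rgb(rng.standard_normal((30, 25, 3)))
print(image.shape)  # (25, 30, 3)
```

In such an image, an action occupying frames t1..t2 becomes a horizontal band of columns, so detecting it reduces to predicting a one-dimensional box along the time axis, which matches the abstract's framing of the task as 1D object detection.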
doi:<a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1177/17298814211038342">10.1177/17298814211038342</a>