STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
[article] arXiv pre-print, 2019
We present a novel classifier network called STEP to classify perceived human emotion from gaits, based on a Spatial Temporal Graph Convolutional Network (ST-GCN) architecture. Given an RGB video of an individual walking, our formulation implicitly exploits the gait features to classify the emotional state of the human into one of four emotions: happy, sad, angry, or neutral. We use hundreds of annotated real-world gait videos and augment them with thousands of annotated synthetic gaits …
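The abstract's core mechanism is a spatial graph convolution over the body's joint skeleton. A minimal sketch of one such layer, assuming a toy 5-joint skeleton and illustrative weights (neither is the paper's actual graph or parameters): each joint's features are averaged with its neighbors' through a normalized adjacency matrix, then passed through a learned linear map.

```python
# Sketch of one spatial graph-convolution step in the spirit of ST-GCN:
# X' = D^-1 (A + I) X W, written with plain Python lists for clarity.
# The skeleton, features, and weights below are illustrative assumptions.

def spatial_graph_conv(X, A, W):
    """X: n_joints x in_dim features; A: n_joints x n_joints adjacency;
    W: in_dim x out_dim weights. Returns n_joints x out_dim features."""
    n = len(A)
    # Add self-loops and row-normalize the adjacency.
    A_hat = [[A[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]
    deg = [sum(row) for row in A_hat]
    A_norm = [[A_hat[i][j] / deg[i] for j in range(n)] for i in range(n)]
    # Aggregate neighbor features: M = A_norm @ X.
    M = [[sum(A_norm[i][k] * X[k][d] for k in range(n)) for d in range(len(X[0]))]
         for i in range(n)]
    # Linear map: X' = M @ W.
    out_dim = len(W[0])
    return [[sum(M[i][d] * W[d][o] for d in range(len(W))) for o in range(out_dim)]
            for i in range(n)]

# Toy 5-joint star skeleton (joint 1 is the hub), 2-D joint coordinates.
A = [[0, 1, 0, 0, 0],
     [1, 0, 1, 1, 1],
     [0, 1, 0, 0, 0],
     [0, 1, 0, 0, 0],
     [0, 1, 0, 0, 0]]
X = [[0.0, 0.0], [0.0, 1.0], [0.0, 2.0], [-1.0, 1.0], [1.0, 1.0]]
W = [[1.0, 0.0], [0.0, 1.0]]  # identity weights so the smoothing is visible

H = spatial_graph_conv(X, A, W)
print(len(H), len(H[0]))  # 5 joints, 2 output channels
```

In the full ST-GCN architecture this spatial step is interleaved with temporal convolutions across frames of the walking sequence; the sketch above shows only the per-frame spatial aggregation.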
arXiv:1910.12906v1
fatcat:lkmwd5kidfcmfbpfbde6rrkq5u