Journal of Chuxiong Normal University ›› 2022, Vol. 37 ›› Issue (3): 91-100.

• Mathematics & Computer Science •

Improved Human Action Recognition Method Based on Spatial Temporal Graph Convolutional Network

Song Wang   

  1. School of Management and Economics, Chuxiong Normal University, Chuxiong, Yunnan 675000, China
  • Received: 2021-04-06    Published: 2022-06-30

Abstract: To improve the accuracy of human action recognition with Spatial Temporal Graph Convolutional Networks (ST-GCN) and to better learn the motion features expressed by joint points and skeleton edges in skeleton data, the existing ST-GCN action recognition model is improved in three ways. First, a directed graph is used to represent the joint points, the bone edges, and the relationships between them, and the joint position differences between adjacent frames are extracted as motion information. Second, a two-stream framework learns the motion information and the spatial information separately to improve recognition performance. Finally, an attention weight matrix makes the graph topology adaptive and enlarges the receptive field of each node, enabling the network to learn semantic information between distal joints and better capture motion features. The proposed method is evaluated on the NTU-RGB+D dataset, where it achieves 96% accuracy, an improvement over the existing ST-GCN model. This method can further promote the wide application of human action recognition technology in smart homes, intelligent monitoring, security, human-computer interaction, content-based video retrieval, smart city development, and other fields.
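The three ingredients named in the abstract can be sketched in a few lines. The following is a minimal illustrative example only, not the paper's implementation: the toy four-joint chain, the array shapes, and the function names are all assumptions. It shows (1) directed bone vectors computed as child joint minus parent joint, (2) motion features as joint position differences between adjacent frames, and (3) an adaptive adjacency formed by adding a learned attention mask to the fixed skeleton adjacency.

```python
import numpy as np

# Hypothetical skeleton: a 4-joint chain, edges given as (child, parent) pairs.
# Joint data has shape (T frames, V joints, C=3 coordinates).
EDGES = [(1, 0), (2, 1), (3, 2)]

def bone_features(joints):
    """Directed bone vectors: each child joint minus its parent joint.
    The root joint (no parent) keeps a zero bone vector."""
    bones = np.zeros_like(joints)
    for child, parent in EDGES:
        bones[:, child] = joints[:, child] - joints[:, parent]
    return bones

def motion_features(joints):
    """Joint position difference between adjacent frames,
    zero-padded at the first frame."""
    motion = np.zeros_like(joints)
    motion[1:] = joints[1:] - joints[:-1]
    return motion

def adaptive_adjacency(A, B):
    """Fixed skeleton adjacency A plus a learned attention mask B,
    row-normalized so the result still acts as a propagation matrix."""
    M = A + B
    return M / np.maximum(M.sum(axis=1, keepdims=True), 1e-6)
```

In a two-stream setup, `motion_features` would feed one stream and the spatial joint/bone data the other, with the two streams' scores fused at the end; `B` would be a trainable parameter updated by backpropagation in the full model.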

Key words: Human Action Recognition, Spatial Temporal Graph Convolutional Network, Directed Graph Network, Attention Mechanism, Two-stream Framework

CLC Number: