HUMAN ACTIVITY DETECTION AND ACTION RECOGNITION IN VIDEOS USING CONVOLUTIONAL NEURAL NETWORKS

Authors

  • Jagadeesh Basavaiah, Vidyavardhaka College of Engineering (VVCE), India
  • Chandrashekar Mohan Patil, Department of Electronics and Communication Engineering, Vidyavardhaka College of Engineering, India

DOI:

https://doi.org/10.32890/jict2020.19.2.1

Keywords:

Action recognition, convolutional neural network, Gaussian Mixture Model, optical flow, SIFT feature extraction

Abstract

Human activity recognition from video scenes has become a significant area of research in computer vision. Action recognition is one of the most challenging problems in video analysis, with applications in human-computer interaction, anomalous activity detection, crowd monitoring, and patient monitoring. Several approaches to human activity recognition using machine learning techniques have been presented. The main aim of this work is to detect and track human activity, and to classify actions on two publicly available video databases. A novel feature extraction approach that combines the Scale Invariant Feature Transform (SIFT) with optical flow computation is used, incorporating shape, gradient, and orientation features for robust feature formulation. Human activity in the video is tracked using a Gaussian Mixture Model, and a Convolutional Neural Network (CNN)-based classification approach is used for training and testing. Activity recognition performance is evaluated on two public datasets, namely the Weizmann dataset and the Kungliga Tekniska Hogskolan (KTH) dataset, with action recognition accuracies of 98.43% and 94.96%, respectively. Experimental and comparative studies show that the proposed approach outperforms state-of-the-art techniques.
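
The abstract describes a three-stage pipeline: SIFT and optical-flow features are extracted per frame, moving persons are segmented and tracked with a Gaussian Mixture Model, and a CNN classifies the resulting samples. The sketch below shows one possible realisation of such a pipeline in Python with OpenCV and Keras; the MOG2 background subtractor, the Farneback flow parameters, the 64x64 sample size, and the network layout are illustrative assumptions, not the authors' reported configuration.

```python
# Hypothetical sketch only: library choices, parameters, and the network layout
# below are assumptions for illustration, not the paper's exact configuration.
import cv2
import numpy as np
from tensorflow.keras import layers, models


def extract_frame_features(video_path, out_size=(64, 64)):
    """Yield per-frame (feature image, SIFT descriptors) pairs from a video clip."""
    cap = cv2.VideoCapture(video_path)
    bg_model = cv2.createBackgroundSubtractorMOG2()  # GMM-based foreground segmentation
    sift = cv2.SIFT_create()                         # Scale Invariant Feature Transform
    prev_gray = None
    ok, frame = cap.read()
    while ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        fg_mask = bg_model.apply(frame)              # moving-person mask from the GMM
        # SIFT keypoints/descriptors restricted to the foreground region
        _, descriptors = sift.detectAndCompute(gray, fg_mask)
        if prev_gray is not None:
            # Dense optical flow between consecutive frames (Farneback method)
            flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
            # Stack shape (mask) and motion (magnitude, orientation) channels
            # into one small image that the CNN can consume.
            sample = np.dstack([cv2.resize(fg_mask, out_size),
                                cv2.resize(mag, out_size),
                                cv2.resize(ang, out_size)]).astype(np.float32)
            yield sample, descriptors
        prev_gray = gray
        ok, frame = cap.read()
    cap.release()


def build_classifier(num_classes):
    """Small CNN over the stacked mask/flow channels (layer sizes are assumptions)."""
    return models.Sequential([
        layers.Input(shape=(64, 64, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
```

In use, the samples yielded by extract_frame_features would be batched per clip, labelled with the clip's action class (e.g. the Weizmann or KTH action categories), and passed to the classifier's fit method; the SIFT descriptors could be pooled and fused with the CNN features, but that fusion step is left out of this sketch.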

Published

31-03-2020

How to Cite

Basavaiah, J., & Mohan Patil, C. (2020). HUMAN ACTIVITY DETECTION AND ACTION RECOGNITION IN VIDEOS USING CONVOLUTIONAL NEURAL NETWORKS. Journal of Information and Communication Technology, 19(2), 157–183. https://doi.org/10.32890/jict2020.19.2.1