Fast action recognition using negative space features
Abstract
Due to the number of potential applications and their inherent complexity, the automatic capture and analysis of actions has become an active research area. In this paper, an implicit method for recognizing actions in a video is proposed. Existing implicit methods work on the regions of subjects, whereas the proposed system works on the surrounding regions of the subjects, called negative spaces. Extracting features from negative spaces allows the system to derive simple yet effective features for describing actions. These negative-space-based features are robust to deformed actions, such as complex boundary variations, partial occlusions, non-rigid deformations and small shadows. Unlike other implicit methods, our method does not require dimensionality reduction, which significantly reduces processing time. Further, we propose a new method to detect cycles of different actions automatically. In the proposed system, the input image sequence is first background-segmented and shadows are eliminated from the segmented images. Next, motion-based features are computed for the sequence. Then, the negative-space-based description of each pose is obtained, and the action descriptor is formed by combining the pose descriptors. Finally, a nearest-neighbor classifier is applied to recognize the action of the input sequence. The proposed system was evaluated on publicly available action datasets and on a new fish action dataset, and showed improvements in both accuracy and processing time. Moreover, it maintained very good accuracy on corrupted image sequences, particularly under noisy segmentation and lower frame rates, and achieved the highest accuracy with the lowest processing time compared with state-of-the-art methods.
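The pipeline described in the abstract (background segmentation with shadow removal, a negative-space pose descriptor per frame, concatenation into an action descriptor, and nearest-neighbor classification) can be illustrated in Python. The following is a minimal sketch, not the authors' implementation: the grid-based negative-space measure, the 4x4 grid size, and all function names here are illustrative assumptions, and OpenCV's MOG2 subtractor stands in for whichever segmentation and shadow-elimination steps the paper actually uses.

import cv2
import numpy as np

def segment_foreground(frames):
    """Background-segment each frame and suppress shadows. MOG2 marks
    shadow pixels as 127, so thresholding at 200 keeps only confident
    foreground (255)."""
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    masks = []
    for frame in frames:
        mask = subtractor.apply(frame)
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
        masks.append(mask)
    return masks

def negative_space_descriptor(mask, grid=(4, 4)):
    """Illustrative pose descriptor: the fraction of background
    ('negative space') pixels in each cell of a grid laid over the
    subject's bounding box. The paper's actual features differ in
    detail; this only conveys the idea of describing the surround."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return np.zeros(grid[0] * grid[1])
    box = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    rows = np.array_split(box, grid[0], axis=0)
    cells = [c for r in rows for c in np.array_split(r, grid[1], axis=1)]
    return np.array([1.0 - (cell > 0).mean() for cell in cells])

def action_descriptor(masks):
    """Combine per-frame pose descriptors into one action descriptor
    (assumes sequences are trimmed to a fixed number of frames, e.g.
    one detected action cycle)."""
    return np.concatenate([negative_space_descriptor(m) for m in masks])

def classify(query, train_descriptors, train_labels):
    """Nearest-neighbor classification, as named in the abstract."""
    dists = [np.linalg.norm(query - d) for d in train_descriptors]
    return train_labels[int(np.argmin(dists))]

Because each pose descriptor is a short, fixed-length vector built directly from the segmented mask, no dimensionality-reduction stage is needed before matching, which is consistent with the processing-time advantage the abstract claims.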
Journal
Expert Systems with Applications
Publication Name
N/A
Volume
41
ISBN/ISSN
1873-6793
Edition
N/A
Issue
2
Page Count
14
Location
N/A
Publisher
Elsevier
Publisher Url
N/A
Publisher Location
N/A
Publish Date
N/A
Url
N/A
Date
N/A
EISSN
N/A
DOI
10.1016/j.eswa.2013.07.082