Research Article

Human gait recognition using firefly template segmentation

Sankara Rao Palla, Gupteswar Sahu & Priyadarsan Parida
Pages 565-575 | Received 19 May 2021, Accepted 27 Nov 2021, Published online: 22 Dec 2021
 

ABSTRACT

Gait recognition is one of the advanced behavioural biometric technologies, which aims to distinguish people by their walking style. The unique advantage of gait recognition is its ability to capture gait at a distance without the prior consent of the subject. In the gait recognition process, the features of human motion are automatically extracted and later used to authenticate the identity of the person in motion. Template-based, model-free gait recognition offers an effective solution through the gait energy image (GEI) and the gradient gait energy image (GGEI), which improve recognition performance under covariate conditions such as normal walking, carrying a bag, and wearing a coat. In this paper, we propose a novel firefly template segmentation (FTS) method, which employs the firefly algorithm to accomplish the boundary selection process. The principal component analysis (PCA) technique is used for dimensionality reduction, and the multiple discriminant analysis (MDA) technique is applied to achieve better class separability. The proposed work is tested on the publicly available CASIA-B dataset, and the experimental results show excellent performance in comparison with other gait recognition methods reported in the literature.
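For orientation, the template-based recognition stage summarised in the abstract (build a gait energy image, reduce its dimensionality with PCA, then classify in a discriminant subspace) can be sketched roughly in Python as below. This is a minimal illustration, not the authors' implementation: the firefly-based template segmentation step is omitted, scikit-learn's LinearDiscriminantAnalysis stands in for MDA, and all function names, parameters, and the aligned-silhouette input are assumptions made for the example.

# Minimal sketch (not the paper's code) of GEI construction followed by
# PCA + LDA classification; names and parameters are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gait_energy_image(silhouettes):
    # Average a sequence of aligned binary silhouettes (each H x W) into a GEI.
    return np.mean(np.stack(silhouettes, axis=0), axis=0)

def fit_recognizer(gei_templates, labels, n_components=50):
    # Flatten the GEI templates, project with PCA, then fit a discriminant
    # classifier (LDA used here in place of MDA) for class separability.
    X = np.stack([g.ravel() for g in gei_templates])
    pca = PCA(n_components=n_components).fit(X)
    lda = LinearDiscriminantAnalysis().fit(pca.transform(X), labels)
    return pca, lda

def identify(pca, lda, gei_probe):
    # Predict the subject identity of a probe GEI.
    return lda.predict(pca.transform(gei_probe.ravel()[None, :]))[0]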

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

Notes on contributors

Sankara Rao Palla

Sankara Rao Palla received his B.Tech degree from SISTAM Engg. College, Srikakulam, in 2009 and his M.Tech from AITAM Engg. College, Srikakulam, in 2012. He is currently pursuing his Ph.D. at GIET University, Gunupur, and working as an Assistant Professor at Raghu Engineering College, Visakhapatnam. His current research interests include computer vision, pattern recognition, and machine learning.

Gupteswar Sahu

Dr. G. Sahu is currently an Associate Professor in the Department of Electronics and Communication Engineering, Raghu Engineering College (A), Visakhapatnam, India. He received the B.Tech degree in Electronics and Communication Engineering from MITS Engineering College, Odisha, India, in 2003, the M.Tech degree in signal processing from IIT Guwahati, Guwahati, India, in 2008, and the Ph.D. degree in Electronics and Communication Engineering from NIT Jamshedpur, Jamshedpur, India, in 2018. His research interests include image processing, time-frequency analysis of non-stationary signals, applications of soft computing in electrical and electronics engineering, and computer simulation techniques.

Priyadarsan Parida

Dr. Priyadarsan Parida is an Associate Professor in the Department of Electronics and Communication Engineering at GIET University, Gunupur, India. He completed his B.Tech and M.Tech in Electronics Engineering at Biju Pattnaik University of Technology, Odisha, and obtained his Ph.D. in Electronics and Telecommunication Engineering from Veer Surendra Sai University of Technology (VSSUT), Burla. His research interests include computer vision and its application to fields such as biomedical image analysis, biometrics, and video analytics.
