Research Article Open Access

PHASEY: A Contrastive Learning Approach for Enhanced Human Gait Phases Recognition

Urvashi1, Deepak Kumar2, Vinay Kukreja1 and Ayush Dogra3
  • 1 Centre for Research Impact & Outcome, Chitkara University Institute of Engineering and Technology, Chitkara University, Rajpura, Punjab, India
  • 2 Chitkara University Institute of Engineering and Technology, Chitkara University, Rajpura, Punjab, India
  • 3 Centre for Research Impact & Outcome, Chitkara University Institute of Engineering and Technology, Chitkara University, Rajpura, Punjab, India

Abstract

Human gait has gained much attention in behavioral biometrics as it possesses unique and distinctive characteristics. Gait phases, which describe the different patterns of human walking, are significant for the analysis and understanding of an individual's movement. Hence, the identification of gait phases is important for the accurate determination and interpretation of walking patterns in applications ranging from healthcare and security to rehabilitation. This study proposes an efficient model, called Precision Human Gait Activity Segmentation for Gait Phases Recognition using YOLOv9 (PHASEY), together with a contrastive learning method that localizes and recognizes the stance and swing gait phases more efficiently and accurately. The proposed PHASEY model localizes walking gait-phase patterns and distinguishes the movement patterns within each phase. It uses CSPDarknet-53 as its backbone, which is further trained to identify swing and stance gait phases from silhouette images. The PHASEY model has three primary components: backbone, neck, and head. The backbone extracts features, the neck provides visualization of those features through Grad-CAM, and the head performs gait phase classification. By training CSPDarknet-53 within the PHASEY model, accuracy, Intersection over Union (IoU), and inference time were measured across different numbers of epochs. The experimental results show that the model attained its highest accuracy of 0.9907 at 50 epochs. A comparison of YOLO models showed that YOLOv9 achieved the highest accuracy of 94.8%, with a precision of 93.1%, a recall of 91.9%, and an IoU of 87.8%. By applying this real-time object detection model to determine the phases of the gait cycle, the approach demonstrated exceptional performance in both localization and classification across different subjects.
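The IoU metric reported above measures the overlap between a predicted gait-phase bounding box and its ground-truth annotation. The following is a minimal sketch of that computation for axis-aligned boxes in [x1, y1, x2, y2] format; the function name, coordinate convention, and example values are illustrative and not taken from the PHASEY implementation.

    # Minimal IoU sketch for two axis-aligned bounding boxes
    # given as [x1, y1, x2, y2]; names and values are illustrative.
    def iou(box_a, box_b):
        # Corners of the overlapping region
        x1 = max(box_a[0], box_b[0])
        y1 = max(box_a[1], box_b[1])
        x2 = min(box_a[2], box_b[2])
        y2 = min(box_a[3], box_b[3])

        # Overlap area (zero if the boxes do not intersect)
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)

        area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
        area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])

        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0

    # Example: predicted vs. ground-truth box around a silhouette in one frame
    print(iou([50, 30, 200, 400], [60, 35, 210, 395]))  # ~0.85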

Journal of Computer Science
Volume 21 No. 8, 2025, 1795-1810

DOI: https://doi.org/10.3844/jcssp.2025.1795.1810

Submitted On: 26 September 2024 | Published On: 16 September 2025

How to Cite: Urvashi, Kumar, D., Kukreja, V., & Dogra, A. (2025). PHASEY: A Contrastive Learning Approach for Enhanced Human Gait Phases Recognition. Journal of Computer Science, 21(8), 1795-1810. https://doi.org/10.3844/jcssp.2025.1795.1810


Keywords

  • Swing Gait Phase
  • Stance Gait Phase
  • Pretrained Network
  • You Only Look Once (YOLO)