TY - JOUR
T1 - Appearance-based passenger counting in cluttered scenes with lateral movement compensation
AU - Sutopo, Ricky
AU - Lim, Joanne Mun Yee
AU - Baskaran, Vishnu Monn
AU - Wong, Kok Sheik
AU - Tistarelli, Massimo
AU - Liau, Heng Fui
N1 - Funding Information:
This work was supported by the School of Engineering and School of Information Technology, Monash University Malaysia, by Intel Technology Sdn Bhd, by grants from the MSCA-RISE European project IDENTITY, by the Italian Ministry of Research projects PRIN COSMOS and SPADA, and by a special grant from the University of Sassari "fondo di Ateneo per la Ricerca 2019 e 2020".
Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
PY - 2021/8
Y1 - 2021/8
N2 - Autonomous passenger counting in public transportation is an integral part of an intelligent transportation system, as it provides vital information for improving the efficiency and resource management of a public transportation network. However, counting passengers in highly crowded scenes is challenging due to their random movement, diverse appearances and inter-object occlusions. Furthermore, state-of-the-art methods in this domain rely heavily on additional custom cameras or sensors rather than existing onboard surveillance cameras, which limits the feasibility of such systems for large-scale deployment. Hence, this paper puts forward an enhanced appearance descriptor with lateral movement compensation, which addresses the difficulty of counting passengers bidirectionally in cluttered scenes. We first construct a head re-identification dataset, which is used to train an appearance descriptor. This dataset fills the absence of a suitable person re-identification dataset for this setting, which in turn allows for accurate tracking of passengers in cluttered scenes. Then, a novel technique of applying a fedora counting line is introduced to count the number of passengers entering and exiting a bus. This technique compensates for the impact of passengers’ lateral movement, which is crucial to increasing the accuracy of bidirectional passenger counting using onboard bus surveillance cameras. In addition, a real-time implementation of the proposed method, which integrates DeepStream with the fedora counting line, is also presented. Experimental results on a challenging test dataset demonstrate that the proposed method outperforms benchmarked techniques, with an average counting accuracy of 93.21% for passengers entering and 96.10% for passengers exiting public buses. Furthermore, the proposed system achieves this accuracy at an average frame rate of 16 frames per second, making it a practical solution for real-time applications.
AB - Autonomous passenger counting in public transportation is an integral part of an intelligent transportation system, as it provides vital information for improving the efficiency and resource management of a public transportation network. However, counting passengers in highly crowded scenes is challenging due to their random movement, diverse appearances and inter-object occlusions. Furthermore, state-of-the-art methods in this domain rely heavily on additional custom cameras or sensors rather than existing onboard surveillance cameras, which limits the feasibility of such systems for large-scale deployment. Hence, this paper puts forward an enhanced appearance descriptor with lateral movement compensation, which addresses the difficulty of counting passengers bidirectionally in cluttered scenes. We first construct a head re-identification dataset, which is used to train an appearance descriptor. This dataset fills the absence of a suitable person re-identification dataset for this setting, which in turn allows for accurate tracking of passengers in cluttered scenes. Then, a novel technique of applying a fedora counting line is introduced to count the number of passengers entering and exiting a bus. This technique compensates for the impact of passengers’ lateral movement, which is crucial to increasing the accuracy of bidirectional passenger counting using onboard bus surveillance cameras. In addition, a real-time implementation of the proposed method, which integrates DeepStream with the fedora counting line, is also presented. Experimental results on a challenging test dataset demonstrate that the proposed method outperforms benchmarked techniques, with an average counting accuracy of 93.21% for passengers entering and 96.10% for passengers exiting public buses. Furthermore, the proposed system achieves this accuracy at an average frame rate of 16 frames per second, making it a practical solution for real-time applications.
KW - Cluttered scenes
KW - Deep learning
KW - Intelligent transportation system
KW - People counting
KW - Person re-identification
UR - http://www.scopus.com/inward/record.url?scp=85100638800&partnerID=8YFLogxK
U2 - 10.1007/s00521-021-05760-x
DO - 10.1007/s00521-021-05760-x
M3 - Article
AN - SCOPUS:85100638800
SN - 0941-0643
VL - 33
SP - 9891
EP - 9912
JO - Neural Computing and Applications
JF - Neural Computing and Applications
IS - 16
ER -