Title: An improved real-time method for counting people in crowded scenes based on a statistical approach
Authors: Riachy, Shirine
Affiliations: Department of Computer Engineering; Department of Mathematics
Subjects: Detectors
Issue Date: 2015
Publisher: IEEE
Part of: 11th International Conference on Informatics in Control, Automation and Robotics (ICINCO)
Start page: 203
End page: 212
Conference: International Conference on Informatics in Control, Automation and Robotics (ICINCO) (11th : 1-3 Sept. 2014 : Vienna, Austria)
Abstract:
In this paper, we present a real-time method for counting people in crowded conditions using an indirect, statistical approach. Our method builds on an algorithm by Albiol et al. that won the PETS 2009 contest on people counting. We employ a state-of-the-art scale-invariant interest point detector known as SURF (Speeded-Up Robust Features), and we exploit motion information to retain only interest points belonging to moving people. Direct proportionality is then assumed between the number of remaining SURF points and the number of people. Our technique was first tested on three video sequences from the PETS dataset; results showed an improvement over Albiol's method in all three cases. It was then tested on our own set of video sequences taken under various conditions. Despite the complexity of the scenes, results were very reasonable, with a mean relative error ranging from 9.36% to 17.06% and a mean absolute error ranging from 1.13 to 3.33. Testing this method on a new dataset proved its speed and accuracy under many shooting scenarios, especially in crowded conditions where the averaging process reduces the variation in the number of detected SURF points per person.
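The counting step described in the abstract rests on a simple proportionality assumption: once non-moving interest points are filtered out, the people count scales linearly with the number of remaining points. A minimal sketch of that statistical step is given below; the function names, the calibration data, and the averaging scheme are illustrative assumptions, not taken from the paper, and the interest-point detection itself (SURF plus motion filtering) is assumed to have already produced the point counts.

```python
# Hypothetical sketch of the proportionality-based counting step.
# Assumes an upstream detector (e.g. SURF with motion filtering)
# already yields the number of interest points on moving people.

def calibrate_points_per_person(frames):
    """Average moving-point count per person over calibration frames.

    `frames` is a list of (num_moving_points, ground_truth_people) pairs
    with known people counts; averaging over frames reduces per-person
    variation in the number of detected points."""
    ratios = [pts / people for pts, people in frames if people > 0]
    return sum(ratios) / len(ratios)

def estimate_people(num_moving_points, points_per_person):
    """Direct proportionality: people ~ moving points / points-per-person."""
    return round(num_moving_points / points_per_person)

# Illustrative calibration: three frames with known ground-truth counts.
calib = [(120, 10), (66, 6), (210, 18)]
k = calibrate_points_per_person(calib)  # average points per person
print(estimate_people(90, k))
```

In this sketch the calibration averages the points-per-person ratio over several frames, mirroring the averaging the abstract credits with smoothing out per-person detection variability in crowded scenes.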
URI: https://scholarhub.balamand.edu.lb/handle/uob/623
Type: Conference Paper
Appears in Collections: Department of Computer Engineering