SmartCountplus - Towards Automated Counting and Modelling of Non-Motorised Traffic with a Stand-Alone Sensor Device

Norbert Brändle, Ahmed Nabil Belbachir (Speaker), Stephan Schraml

Research output: Chapter in Book or Conference Proceedings › Conference Proceedings with Oral Presentation › peer-review

Abstract

We introduce a novel visual counting device that can automatically discriminate between participants of non-motorised traffic (pedestrians, bicyclists). The sensor elements (pixels) respond to relative light intensity changes, thus avoiding conventional imaging and the privacy issues usually raised by the public when it comes to visual surveillance. Three-dimensional depth information is computed with the stereo principle, and the set of light intensity change events is grouped with a clustering algorithm to discriminate between moving objects. A classification algorithm based on descriptive features then identifies individual participants of non-motorised traffic. A preliminary evaluation on a dataset with 128 passages shows a classification rate of 92% for riding cyclists and 100% for pedestrians for 2+1 classification, and 43-96% for 4+1 classification distinguishing between riding cyclists, pedestrians, walking cyclists, umbrellas and other objects.
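The abstract outlines a pipeline of event clustering followed by feature-based classification. As a rough illustration only, the Python sketch below groups synthetic intensity-change events into candidate moving objects and applies a toy speed-based 2+1 classifier. Everything in it is an assumption made for illustration: the paper does not name its clustering algorithm (DBSCAN stands in), and the features, thresholds and parameter names (time_scale, eps, min_events, speed_threshold) are hypothetical, not the device's actual processing.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def cluster_events(events, time_scale=0.05, eps=5.0, min_events=20):
        """Group intensity-change events (t, x, y) into candidate moving
        objects. Dividing t by time_scale makes time comparable to pixel
        distance, so one radius eps covers space and time together."""
        scaled = events.astype(float).copy()
        scaled[:, 0] /= time_scale
        return DBSCAN(eps=eps, min_samples=min_events).fit_predict(scaled)

    def classify_cluster(cluster, speed_threshold=50.0):
        """Toy 2+1 classifier: label a cluster by its mean horizontal image
        speed (pixels/second). The paper's descriptive features are richer
        and unspecified; this threshold is purely illustrative."""
        t, x = cluster[:, 0], cluster[:, 1]
        if np.ptp(t) == 0.0:
            return "other"
        speed = abs(np.polyfit(t, x, 1)[0])  # slope of x over time
        return "riding cyclist" if speed > speed_threshold else "pedestrian"

    # Hypothetical usage: one synthetic object drifting right at ~200 px/s,
    # plus background noise events that DBSCAN should mark as -1 (no object).
    rng = np.random.default_rng(0)
    t = rng.uniform(0.0, 0.2, 300)
    obj = np.column_stack([t, 40 + 200 * t + rng.normal(0, 1, 300),
                           60 + rng.normal(0, 1, 300)])
    noise = np.column_stack([rng.uniform(0.0, 0.2, 30),
                             rng.uniform(0, 128, 30),
                             rng.uniform(0, 128, 30)])
    events = np.vstack([obj, noise])
    labels = cluster_events(events)
    print(classify_cluster(events[labels == 0]))  # -> "riding cyclist"

Scaling event timestamps into pixel-equivalent units lets a single distance threshold govern both spatial and temporal proximity; the actual device additionally has stereo depth available, which this sketch omits.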
Original language: English
Title of host publication: Real Corp 2010
Pages: 1261-1266
Number of pages: 6
Publication status: Published - 2010
Event: 15th International Conference on Urban Planning and Regional Development in the Information Society (REAL CORP 2010)
Duration: 9 Sept 2010 – 12 Sept 2010

Conference

Conference: 15th International Conference on Urban Planning and Regional Development in the Information Society (REAL CORP 2010)
Period: 9/09/10 – 12/09/10

Research Field

  • Former Research Field - Mobility Systems
  • Former Research Field - Digital Safety and Security
