SmartCountplus - Towards Automated Counting and Modelling of Non-Motorised Traffic with a Stand-Alone Sensor Device

Norbert Brändle, Ahmed Nabil Belbachir (Speaker), Stephan Schraml

Publication: Contribution to book or conference proceedings › Lecture with contribution in proceedings › Peer-reviewed

Abstract

We introduce a novel visual counting device that automatically discriminates between participants of non-motorised traffic (pedestrians, bicyclists). The sensor elements (pixels) respond to relative changes in light intensity, thus avoiding conventional imaging and the privacy issues the public usually raises about visual surveillance. Three-dimensional depth information is computed using the stereo principle, and the set of light intensity change events is grouped by a clustering algorithm to discriminate between moving objects. A classification algorithm based on descriptive features then identifies individual participants of non-motorised traffic. A preliminary evaluation on a dataset with 128 passages shows a classification rate of 92% for riding cyclists and 100% for pedestrians for 2+1 classification, and 43-96% for 4+1 classification distinguishing between riding cyclists, pedestrians, walking cyclists, umbrellas and other objects.
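The abstract describes grouping sparse light-intensity-change events into per-object clusters before classification. The paper's actual algorithm is not given here; the following is a minimal illustrative sketch, assuming events are 2D points and that points within a fixed radius belong to the same moving object. Cluster size and shape could then feed a feature-based classifier.

```python
# Hypothetical sketch (not the authors' implementation): group sparse
# intensity-change events into moving-object clusters using a simple
# union-find over pairwise distances.
from math import hypot

def cluster_events(events, radius=5.0):
    """Group 2D event points into clusters; points within `radius`
    of each other end up in the same cluster."""
    parent = list(range(len(events)))

    def find(i):
        # Path-halving find for the union-find structure.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    # Merge all event pairs closer than the radius.
    for i, (xi, yi) in enumerate(events):
        for j in range(i + 1, len(events)):
            xj, yj = events[j]
            if hypot(xi - xj, yi - yj) <= radius:
                union(i, j)

    # Collect members by root representative.
    clusters = {}
    for i in range(len(events)):
        clusters.setdefault(find(i), []).append(events[i])
    return list(clusters.values())
```

For example, two well-separated blobs of events yield two clusters, which a downstream classifier could label as two distinct traffic participants.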
Original language: English
Title: Real Corp 2010
Pages: 1261-1266
Number of pages: 6
Publication status: Published - 2010
Event: 15th International Conference on Urban Planning and Regional Development in the Information Society (REAL CORP 2010)
Duration: 9 Sept. 2010 – 12 Sept. 2010

Conference

Conference: 15th International Conference on Urban Planning and Regional Development in the Information Society (REAL CORP 2010)
Period: 9/09/10 – 12/09/10

Research Field

  • Former Research Field - Mobility Systems
  • Former Research Field - Digital Safety and Security
