A Real-time Pedestrian Classification Method for Event-based Dynamic Stereo Vision

Stephan Schraml, Ahmed Nabil Belbachir (presenter), Norbert Brändle

Publication: Contribution to book or conference proceedings › Presentation with paper in proceedings › Peer-reviewed

Abstract

This paper proposes a real-time implementation of a clustering and classification method using asynchronous events generated by scene activity in an event-based dynamic stereo vision system. The inherent detection of moving objects offered by the system, which comprises a pair of dynamic vision sensors, enables event-based stereo vision in real-time and a 3D representation of moving objects. The clustering and classification method exploits the sparse spatio-temporal representation of the sensors' events for real-time detection and separation of moving objects. The method uses density and distance metrics to cluster the asynchronous events generated by scene dynamics (changes in the scene). It has been evaluated on clustering the events of persons moving across the sensor field of view. The method has been implemented on the Analog Devices Blackfin BF537 and tested on real scenarios with more than 100 persons. The results show that the resulting asynchronous events can be successfully clustered in real-time and that pedestrians are correctly classified in more than 92% of the cases.
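The abstract describes the clustering step only at a high level. The sketch below illustrates how density- and distance-based clustering of asynchronous events might look, assuming each event is an (x, y, t) tuple; the DBSCAN-style expansion and the parameter names and values (EPS_XY, EPS_T, MIN_PTS) are illustrative assumptions for this sketch, not the paper's actual method or settings.

```python
import numpy as np

# Illustrative thresholds; the paper does not specify these values.
EPS_XY = 5.0     # spatial neighbourhood radius in pixels (assumed)
EPS_T = 10_000   # temporal neighbourhood radius in microseconds (assumed)
MIN_PTS = 8      # minimum neighbours for a dense (core) event (assumed)

def neighbours(events, i):
    """Indices of events within the spatio-temporal neighbourhood of event i."""
    dx = events[:, 0] - events[i, 0]
    dy = events[:, 1] - events[i, 1]
    dt = np.abs(events[:, 2] - events[i, 2])
    return np.flatnonzero((dx * dx + dy * dy <= EPS_XY ** 2) & (dt <= EPS_T))

def cluster_events(events):
    """DBSCAN-style labelling of events: -1 = noise, 0..k = cluster ids."""
    labels = np.full(len(events), -1)
    cluster_id = -1
    for i in range(len(events)):
        if labels[i] != -1:
            continue                    # already assigned to a cluster
        seeds = neighbours(events, i)
        if len(seeds) < MIN_PTS:
            continue                    # not dense enough; stays noise unless absorbed later
        cluster_id += 1
        labels[i] = cluster_id
        queue = list(seeds)
        while queue:                    # grow the cluster through dense neighbours
            j = queue.pop()
            if labels[j] != -1:
                continue
            labels[j] = cluster_id
            nb = neighbours(events, j)
            if len(nb) >= MIN_PTS:
                queue.extend(nb)
    return labels

# Usage: events is an (N, 3) array of (x, y, timestamp) rows.
# labels = cluster_events(events)
```

In this sketch, each resulting cluster corresponds to one moving object (e.g. a pedestrian), since the dynamic vision sensors emit events only where the scene changes; stationary background produces no events and therefore no clusters.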
Original language: English
Title: Proceedings Sixth IEEE Workshop on Embedded Computer Vision
Number of pages: 7
Publication status: Published - 2010
Event: Sixth IEEE Workshop on Embedded Computer Vision, in conjunction with CVPR2010
Duration: 13 June 2010 → …


Research Field

  • Former Research Field - Mobility Systems
  • Former Research Field - Digital Safety and Security
