Data Selection for Reduced Training Effort in Vandalism Sound Event Detection

Stefan Grebien (author and presenter), Franz Graf, Ferdinand Fuhrmann, Michael Hubner, Stephan Veigl

Publication: Contribution in book or conference proceedings › Presentation with paper in proceedings › Peer-reviewed



Typical sound event detection (SED) applications, employed in
real environments, generate huge amounts of unlabeled data
each day. These data can potentially be used to re-train the
underlying machine learning models. However, as the labeling
budget is usually restricted, active learning plays a vital role in
re-training. Especially for applications with sparse event occurrence,
a data selection process is paramount. In this paper we (i)
introduce a novel application for vandalism SED, and (ii) analyze
an active learning scheme for reduced training and annotation
effort. In the presented system, the employed machine learning classifier
shall recognize various acts of vandalism, e.g., glass breakage
and graffiti spraying. To this end, we utilize embeddings generated
with a pre-trained network and train a recurrent neural
network for event detection. The applied data selection strategy
is based on a mismatch-first, farthest-traversal approach and is
compared to an upper bound by using all available data. Furthermore,
results for the active learning scheme are evaluated
with respect to different labeling budgets and compared to a
random sampling baseline.
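The mismatch-first, farthest-traversal selection named above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names, the source of the proxy labels (e.g., predictions from a previous model or clustering), and the Euclidean distance in embedding space are all assumptions. The idea is to prioritize samples where the classifier disagrees with a cheap proxy label, and within that set to pick samples that are maximally spread out in embedding space.

```python
import numpy as np

def farthest_traversal(embeddings, candidates, selected, k):
    """Greedily pick up to k candidate indices, each as far (Euclidean
    distance) as possible from everything chosen so far."""
    chosen = []
    pool = list(candidates)
    if not pool:
        return chosen
    if selected:
        # distance of each pool item to its nearest already-selected point
        ref = embeddings[selected]
        dmin = np.linalg.norm(
            embeddings[pool][:, None, :] - ref[None, :, :], axis=-1
        ).min(axis=1)
    else:
        dmin = np.full(len(pool), np.inf)  # first pick is arbitrary
    for _ in range(min(k, len(pool))):
        i = int(np.argmax(dmin))
        chosen.append(pool[i])
        # update nearest-selected distances with the new pick
        d_new = np.linalg.norm(embeddings[pool] - embeddings[pool[i]], axis=1)
        dmin = np.minimum(dmin, d_new)
        dmin[i] = -np.inf  # never pick the same item twice
    return chosen

def mismatch_first_farthest_traversal(embeddings, model_preds,
                                      proxy_labels, budget):
    """Select up to `budget` indices for annotation: mismatched samples
    first, then fill the remaining budget from the matched samples,
    both ordered by farthest traversal."""
    mismatched = [i for i, (p, q) in enumerate(zip(model_preds, proxy_labels))
                  if p != q]
    matched = [i for i in range(len(model_preds)) if i not in set(mismatched)]
    selected = farthest_traversal(embeddings, mismatched, [],
                                  min(budget, len(mismatched)))
    if len(selected) < budget:
        selected += farthest_traversal(embeddings, matched, selected,
                                       budget - len(selected))
    return selected
```

Under a restricted labeling budget, such a scheme spends annotations first on samples the model is likely getting wrong, while the farthest-traversal step avoids labeling near-duplicates, which matters when events are sparse.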
Title: Book of peer-reviewed papers, 10th Congress of the Alps Adria Acoustics Association
ISBN (electronic): 978-961-94085-2-0
Publication status: Published - 21 Sept 2023
Event: 10th Congress of the Alps Adria Acoustics Association - Izola, Slovenia
Duration: 20 Sept 2023 - 21 Sept 2023


Conference: 10th Congress of the Alps Adria Acoustics Association

Research Field

  • Former Research Field - New Sensor Technologies


