Effective remote automated vehicle operation: a mixed reality contextual comparison study

Michael Gafert, Alexander Mirnig, Peter Fröhlich, Vanessa Kraut, Zoja Anzur, Manfred Tscheligi

Research output: Contribution to journal › Article › peer-review


With the increasing pervasion of automated vehicle fleets, there is an equally increased need for effective remote operation capabilities to intervene in cases of vehicle malfunction, rough road conditions, or unnavigable areas. This shift in the human operator role to that of an observer and occasional teleoperator requires appropriate interaction interfaces. In this paper, we present an in-context analysis of previously proposed teleoperation information requirements, realized as concrete user interfaces in a user study (N=16). Participants completed multiple tasks in a mixed reality approach, controlling an actual miniature vehicle via an XR-enabled headset that displayed two interface manifestations (minimum and maximum requirements realized). From the results, we derive separate sets of fundamental UI elements per requirement for effective teleoperation in two phases of interaction: orientation and navigation. We conclude with further suggestions for extending teleoperation UIs to contexts and tasks beyond the navigation use case employed in the study.
Original language: English
Pages (from-to): 2321-2338
Journal: Personal and Ubiquitous Computing
Publication status: Published - 15 Dec 2023

Research Field

  • Former Research Field - Capturing Experience


Keywords

  • Automated vehicles
  • Remote operation
  • Teleoperation
  • Interfaces
  • Extended reality

