Abstract
TalkWithMachines aims to enhance human-robot interaction in safety-critical industrial systems by integrating large language models (LLMs) and vision language models (VLMs) with robot control and perception, enabling robots to understand natural-language commands and perceive their environment. Translating a robot's internal state into human-readable text gives operators clearer insight for safer operation. The paper presents four workflows in a set of experiments: low-level control, language-based feedback, visual input, and robot structure-informed task planning. The proposed approach outperforms the prior method in grasping (100% vs. 90% success) and obstacle avoidance (50% vs. 30% success). Supplementary materials are available on the project website: https://talk-machines.github.io.
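The abstract's idea of translating a robot's internal state into human-readable text can be illustrated with a minimal sketch. This is not the paper's implementation; the state fields and function names (`RobotState`, `describe_state`) are assumptions chosen for illustration only.

```python
# Illustrative sketch of turning structured robot state into operator-facing
# text, in the spirit of the language-based feedback workflow. All names and
# fields here are hypothetical, not taken from the TalkWithMachines paper.

from dataclasses import dataclass


@dataclass
class RobotState:
    gripper_closed: bool
    joint_angles: list      # joint positions in degrees
    obstacle_detected: bool


def describe_state(state: RobotState) -> str:
    """Render a structured state as a human-readable status sentence."""
    parts = []
    parts.append("gripper closed" if state.gripper_closed else "gripper open")
    angles = ", ".join(f"{a:.0f} deg" for a in state.joint_angles)
    parts.append(f"joint angles: {angles}")
    if state.obstacle_detected:
        parts.append("warning: obstacle detected in workspace")
    return "Robot status: " + "; ".join(parts) + "."


if __name__ == "__main__":
    print(describe_state(RobotState(True, [0.0, 45.0, 90.0], False)))
```

A summary like this could then be shown to an operator directly, or passed to an LLM as context when interpreting the next natural-language command.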
| Original language | English |
|---|---|
| Title | Proceedings 2024 Eighth IEEE International Conference on Robotic Computing (IRC) |
| DOIs | |
| Publication status | Published - 18 Dec. 2024 |
Research Field
- Assistive and Autonomous Systems
Fingerprint
Explore the research topics of "TalkWithMachines: Enhancing Human-Robot Interaction Through Large/Vision Language Models". Together they form a unique fingerprint.