Language-driven closed-loop grasping with model-predictive trajectory optimization

Huy-Hoang Nguyen, Minh Nhat Vu, Florian Beck, Gerald Ebmer, Anh Nguyen, Wolfgang Kemmetmüller, Andreas Kugi

Publication: Contribution to journal › Article › peer-reviewed

Abstract

Integrating a vision module into a closed-loop control system for the seamless movement of a robot in a manipulation task is challenging due to the inconsistent update rates of the utilized modules. This task is even more difficult in a dynamic environment, e.g., when objects are moving. This paper presents a modular zero-shot framework for language-driven manipulation of (dynamic) objects through a closed-loop control system with real-time trajectory replanning and online 6D object pose localization. We segment an object within 0.5 s by leveraging a vision-language model via language commands. Then, guided by natural language commands, a closed-loop system, combining unified pose estimation and tracking with online trajectory planning, continuously tracks this object and computes the optimal trajectory in real time. Our proposed zero-shot framework provides a smooth trajectory that avoids jerky movements and ensures the robot can grasp a nonstationary object. Experimental results demonstrate the real-time capability of the proposed zero-shot modular framework to accurately and efficiently grasp moving objects. The framework achieves update rates of up to 30 Hz for the online 6D pose localization module and 10 Hz for the receding-horizon trajectory optimization. These advantages highlight the modular framework’s potential applications in robotics and human–robot interaction; see the video at language-driven-grasping.github.io.
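The abstract describes two modules running at different rates in one closed loop: a 6D pose tracker at up to 30 Hz and a receding-horizon trajectory optimizer at 10 Hz. The Python sketch below illustrates one way such a multi-rate loop could be structured with threads sharing the latest pose estimate; all names (SharedPose, pose_tracking_loop, replanning_loop, and the injected callables) are hypothetical placeholders, not the authors' implementation.

```python
"""Minimal sketch, assuming a ~30 Hz pose tracker and a ~10 Hz replanner
that communicate through a shared, thread-safe pose buffer."""
import threading
import time


class SharedPose:
    """Thread-safe container for the latest 6D object pose estimate."""

    def __init__(self):
        self._lock = threading.Lock()
        self._pose = None  # e.g., (position, quaternion) -- placeholder format

    def write(self, pose):
        with self._lock:
            self._pose = pose

    def read(self):
        with self._lock:
            return self._pose


def pose_tracking_loop(shared_pose, get_camera_frame, estimate_pose, rate_hz=30.0):
    """Online 6D pose localization: refresh the shared pose at ~30 Hz."""
    period = 1.0 / rate_hz
    while True:
        frame = get_camera_frame()          # hypothetical camera interface
        shared_pose.write(estimate_pose(frame))  # hypothetical pose estimator
        time.sleep(period)


def replanning_loop(shared_pose, optimize_trajectory, send_to_robot, rate_hz=10.0):
    """Receding-horizon trajectory optimization: replan at ~10 Hz toward the
    most recent pose, so a moving grasp target is followed between plans."""
    period = 1.0 / rate_hz
    while True:
        pose = shared_pose.read()
        if pose is not None:
            trajectory = optimize_trajectory(goal_pose=pose)  # hypothetical MPC call
            send_to_robot(trajectory)                         # hypothetical robot interface
        time.sleep(period)


def run(get_camera_frame, estimate_pose, optimize_trajectory, send_to_robot):
    """Start both loops concurrently; they run until the process exits."""
    shared_pose = SharedPose()
    threading.Thread(
        target=pose_tracking_loop,
        args=(shared_pose, get_camera_frame, estimate_pose),
        daemon=True,
    ).start()
    replanning_loop(shared_pose, optimize_trajectory, send_to_robot)
```

The separation into independent loops with their own rates mirrors the framework's stated motivation: the controller should not be blocked by the slower perception and planning updates.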
Original language: English
Article number: 103335
Number of pages: 8
Journal: Mechatronics
Volume: 109
DOI:
Publication status: Published - 2025

Research Field

  • Complex Dynamical Systems
