Abstract
Learning from demonstration (LfD) has emerged as a promising approach that enables robots to acquire complex tasks directly from human demonstrations. However, tasks involving surface interactions on freeform 3D surfaces pose unique challenges in modeling and execution, especially when geometric variations exist between the demonstrations and the robot execution. This paper proposes a novel framework called probabilistic surface interaction primitives (ProSIP), which systematically incorporates the surface path and the local surface features into the learning procedure. An instrumented tool allows seamless recording and execution of human demonstrations. By design, ProSIPs are independent of time, invariant to rigid-body displacements, and applicable to any robotic platform with a Cartesian controller. The framework is applied to an edge-cleaning task for bathroom sinks, and its generalization capability to various object geometries, including significantly distorted objects, is demonstrated. Simulations and an experimental setup with a 9-degree-of-freedom robotic platform confirm the performance.
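The invariance to rigid-body displacements claimed in the abstract can be illustrated with a minimal sketch (not the authors' implementation; all names and frames here are hypothetical): if a demonstrated tool path is expressed in a frame attached to the surface, any global SE(3) displacement applied to both the surface frame and the path cancels out.

```python
import numpy as np

def to_local(R_s, p_s, points):
    """Express world-frame points in a surface-attached frame (R_s, p_s).

    Row-wise this computes R_s^T (x - p_s), i.e. the surface-local
    coordinates of each point x.
    """
    return (points - p_s) @ R_s

def random_rotation(rng):
    """Draw a proper rotation matrix via QR decomposition (illustrative)."""
    Q = np.linalg.qr(rng.normal(size=(3, 3)))[0]
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1  # flip one axis to ensure det(Q) = +1
    return Q

rng = np.random.default_rng(0)
points = rng.normal(size=(5, 3))      # a toy demonstrated tool path
R_s = random_rotation(rng)            # surface frame orientation
p_s = rng.normal(size=3)              # surface frame origin

# Apply one arbitrary rigid-body displacement (R, t) to the whole scene,
# i.e. to the path and to the surface frame alike.
R = random_rotation(rng)
t = np.array([1.0, -2.0, 0.5])
points_disp = points @ R.T + t
R_s_disp, p_s_disp = R @ R_s, R @ p_s + t

# The surface-local representation is unchanged by the displacement:
# (R R_s)^T (R x + t - (R p_s + t)) = R_s^T (x - p_s)
local_before = to_local(R_s, p_s, points)
local_after = to_local(R_s_disp, p_s_disp, points_disp)
print(np.allclose(local_before, local_after))  # True
```

This property is what lets a primitive learned on one sink pose transfer to a displaced or re-oriented instance without retraining.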
Original language | English |
---|---|
Title of host publication | Proceedings of the 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) |
Pages | 5956-5963 |
Publication status | Published - 25 Dec 2024 |
Event | 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024) - Abu Dhabi, United Arab Emirates Duration: 14 Oct 2024 → 18 Oct 2024 |
Conference
Conference | 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024) |
---|---|
Country/Territory | United Arab Emirates |
City | Abu Dhabi |
Period | 14/10/24 → 18/10/24 |
Research Field
- Complex Dynamical Systems
Fingerprint
Dive into the research topics of 'ProSIP: Probabilistic Surface Interaction Primitives for Learning of Robotic Cleaning of Edges'.
Prizes
- IROS Best Application Paper Award
Unger, C. (Recipient), Hartl-Nesic, C. (Recipient), Vu, M. N. (Recipient) & Kugi, A. (Recipient), 17 Oct 2024
Prize: Prize/Award › Best paper award in journal / conference