A control system based on multiple sensors is proposed for the safe collaboration of a robot with a human. New constrained and contactless human-robot coordinated motion tasks are defined to control the robot end-effector so as to maintain a desired relative position to the human head while pointing at it. Simultaneously, the robot avoids any collision with the operator and with nearby static or dynamic obstacles, based on distance computations performed in the depth space of an RGB-D sensor. The various tasks are organized with priorities and executed under hard joint bounds using the Saturation in the Null Space (SNS) algorithm. Direct human-robot communication is integrated within a mixed reality interface using a stereo camera and an augmented reality system. The proposed system is relevant to on-line, collaborative quality assessment phases in a manufacturing process. Several experimental validation scenarios using a 7-dof KUKA LWR4 robot are presented.
NOTICE: this is the author's version of a work that was accepted for publication in Robotics and Computer-Integrated Manufacturing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Robotics and Computer-Integrated Manufacturing, 67 (2021).
© 2021, Elsevier. Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International http://creativecommons.org/licenses/by-nc-nd/4.0/
- control systems
- human-robot interaction
- mixed reality