This video demonstrates our method for “Robot Arm Cartesian Teleoperation from Noisy and Low-Frequency Human Position Information”. A single RGB-D camera provides the input: 2D human body pose estimates are obtained from OpenPose, and depth information is extracted from the associated point cloud. GitHub | Publication
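Combining a 2D keypoint with its depth value amounts to back-projecting the pixel through the camera's pinhole model. A minimal sketch of this step is shown below; the intrinsics (`fx`, `fy`, `cx`, `cy`) are hypothetical example values for a 640x480 RGB-D stream, not parameters from the actual setup:

```python
import numpy as np

def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project a 2D pixel (u, v) with metric depth (meters)
    into a 3D point in the camera frame using the pinhole model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics for illustration only.
p = pixel_to_3d(320, 240, 1.0, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
# p is a point close to the optical axis, 1 m in front of the camera.
```

In practice an organized point cloud already stores the 3D coordinate per pixel, so the pipeline can also look the point up directly instead of back-projecting.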
Human demonstration for robot movements
Demonstration of a ROS pipeline that tracks human movements and replicates them on a UR3 collaborative robot. The 3D human wrist position is acquired using OpenPose and the associated point cloud information. The onset and end of the movement are detected, and the resulting raw trajectory is smoothed using Bezier interpolation. The resulting Bezier curve is replicated by the UR3 using velocity control, with the commanded velocities regulated by a P-controller. For comparison, the robot motion produced from the raw (unsmoothed) human movement data is also demonstrated.
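The two core pieces above, Bezier smoothing and proportional velocity regulation, can be sketched as follows. This is a hedged illustration, not the actual implementation: the control points, gain `kp`, and speed limit `v_max` are assumed example values:

```python
import numpy as np

def bezier(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using
    de Casteljau's algorithm (repeated linear interpolation)."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def p_control_velocity(target, current, kp=1.5, v_max=0.25):
    """Proportional (P) velocity command toward a Cartesian target,
    clamped to a maximum end-effector speed (m/s)."""
    v = kp * (np.asarray(target, dtype=float) - np.asarray(current, dtype=float))
    norm = np.linalg.norm(v)
    return v if norm <= v_max else v * (v_max / norm)

# Smooth a noisy wrist path with example control points, then command
# a velocity from the robot's current pose toward a point on the curve.
waypoint = bezier([[0.0, 0.0, 0.2], [0.1, 0.2, 0.3], [0.3, 0.1, 0.2]], 0.5)
cmd = p_control_velocity(waypoint, current=[0.0, 0.0, 0.2])
```

At each control cycle the robot would evaluate the curve at the current parameter value and publish the clamped velocity, which is what keeps the motion smooth compared with feeding raw keypoint positions directly.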