COM2009-3009 Robotics: Lecture 7
Document Details
The University of Sheffield
Dr Tom Howard
Summary
This University of Sheffield robotics lecture covers reaching, grasping, and visual servoing. It explains forward and inverse kinematics for controlling robot arms, introduces image-based visual servoing as a way to overcome sensing and positioning errors, and closes with the role of touch sensing in grasping.
Full Transcript
Slide 1 – Title
© 2023 The University of Sheffield
COM2009-3009 Robotics, Lecture 7: Reaching and Grasping
Dr Tom Howard, Multidisciplinary Engineering Education (MEE)

Slide 2 – Industrial Robots Dominate
https://www.therobotreport.com/latest-researchreport-shows-10-4-cagr-for-robotics-to-2025/

Slide 3 – Industrial Applications

Slide 4 – This lecture will cover
1. Using kinematics for robot arm reaching
2. Image-Based Visual Servoing
3. Grasping

Slide 5 – Kinematic Control of Robot Arms
Task: control the position of the end-effector by the coordinated action of the robot's joints and linkages.
Data: joint angles and linkage lengths.
Forward kinematics is the process of using the measured joint angles and the specific kinematic equations of a given robot to compute the position of the end-effector:
joint angles + kinematic equations -> end-effector position
Inverse kinematics uses the specific kinematic equations of the robot to compute the joint angles (and linkages) required to obtain a desired end-effector position.
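The two mappings just defined can be sketched for a hypothetical two-link planar arm. This is illustrative code, not from the lecture: the link lengths L1 and L2 and the function names are assumptions.

```python
import math

# Hypothetical two-link planar arm; link lengths are illustrative assumptions.
L1, L2 = 1.0, 0.8

def forward_kinematics(theta1, theta2):
    """Joint angles + kinematic equations -> end-effector position (x, y)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y):
    """Desired end-effector position + kinematic equations -> joint angles."""
    # Elbow angle from the law of cosines (clamped against rounding error);
    # this picks one of the two possible elbow solutions.
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2
```

Running inverse_kinematics on the output of forward_kinematics recovers the original joint angles (for the chosen elbow branch), which is exactly the round trip the two definitions describe.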
desired end-effector position + kinematic equations -> joint angles
(Diagram from www.Thespod.com)

Slide 6 – Kinematic Control of Robot Arms: a simple example
Servo motors; potentiometer controls.

Slide 7 – Kinematic Control of Robot Arms
Courtesy of Dr Shuhei Miyashita (ACS231 "Mechatronics")

Slide 8 – Kinematic Control of Robot Arms
moveit.ros.org

Slide 9 – Kinematic Control of Robot Arms (open loop!)
Pre-defined target position (reference input) -> inverse kinematics -> control signals (joint angles) -> plant/process -> end-effector position (controlled output).

Slide 10 – Kinematic Control of Robot Arms (closed loop? not really)
The same chain, but the measured joint angles are fed back through forward kinematics to give an indirectly estimated end-effector position.

Slide 11 – Kinematic Control of Robot Arms: common errors
- Sensor errors: sensors drift, fail, or are simply not able to monitor the output (end-effector position) directly.
- Positional errors: target position ≠ actual position.
- Controller errors: system performance degrades over time, or system models aren't accurate.

Slide 12 – Which can be overcome by:
- High-precision sensors ($$$!)
- Stiff metal linkages
- Highly engineered workspace
- High-precision motors
- Maintenance

Slide 13 – Beyond robot arms
Humanoid robots; simulated agents.

Slide 14 – Quiz: Inverse kinematics defines:
A. The robot's joint angles
B. The end-effector position
C. The error between current and desired positions
D. The transformation between frames of reference

Slide 15 – Summary of Kinematic Control
1. Forward kinematics uses the defined kinematic equations of a system and measured joint angles to calculate the position of the end-effector.
2. Inverse kinematics uses the defined kinematic equations of a system and a desired end-effector position to derive the desired joint angles (and establish control signals).
3. Applications extend beyond robot arms to humanoid robots and much more.
4. Sensing, positional and controller errors limit the applicability of these methods to simpler systems.

Slide 16 – Where to find out more
robotacademy.net.au

Slide 17 – This lecture:
1. Using kinematics for robot arm reaching
2. Image-Based Visual Servoing
3. Grasping

Slide 18 – Visual Servoing Concept
1. Add a camera to the end-effector of the robot to directly observe the target.
2. Minimising the difference between stored and current views will solve the positioning problem.
3. Bypasses many positioning, sensing and controller issues.
(The workspace; target position: B)

Slides 19-20 – Visual Servoing Concept (continued)
The same three points, illustrated with the end-effector at a new position A (and intermediate positions C and D) relative to the target position B in the workspace.

Slide 21 – Visual Servoing Concept (continued)
Challenge: how do we convert errors that are measured in 'pixel space' into corrective movement in 'joint space'?

Slide 22 – Machine Vision 101
A pinhole camera will form an inverted image on a 2-D surface mounted behind the aperture.
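A minimal sketch of the projection this model implies, assuming the standard perspective equations u = f̂·X/Z, v = f̂·Y/Z (illustrative code, not from the lecture):

```python
def project(X, Y, Z, f_hat=1.0):
    """Pinhole projection of a camera-frame point [X, Y, Z] to pixel
    coordinates [u, v]; f_hat is the focal length in pixels."""
    u = f_hat * X / Z
    v = f_hat * Y / Z
    return u, v

# Doubling the depth Z halves the projected offset from the image centre.
```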
We define:
- [u, v] – position of the feature in pixel space
- f̂ – focal length in pixel space, f̂ = f/ρ (f = focal distance in m, ρ = pixel size in m/pixel)
- [Cx, Cy, Cz] – camera position in 3D space
- [ωx, ωy, ωz] – angular rotations about those axes

Slide 23 – Visual Servoing Formalisation
At the new position A, the features appear at pixel coordinates (u1, v1), (u2, v2), (u3, v3); in the stored reference view they appear at (u'1, v'1), (u'2, v'2), (u'3, v'3).

Slide 24 – Visual Servoing Formalisation
The differences between the two sets of coordinates define desired pixel velocities (u̇1, v̇1), (u̇2, v̇2), (u̇3, v̇3).

Slide 25 – The Image Jacobian
The image Jacobian relates the velocity of the camera in 3D space to the velocity of the pixels in the image plane. The relationship between pixel and camera velocities for one point is:

[u̇, v̇]^T = J_p(u, v, Z) · [vx, vy, vz, ωx, ωy, ωz]^T

where [vx, vy, vz] are the translational velocities along the camera x, y and z axes, [ωx, ωy, ωz] are the rotational velocities around those axes, and

J_p(u, v, Z) = [ −f̂/Z    0      u/Z   −uv/f̂        −(f̂ + u²/f̂)    v ]
               [  0     −f̂/Z    v/Z    f̂ + v²/f̂    −uv/f̂         −u ]

Slide 26 – Worked Example
Task: for a camera with f̂ = 1, what is the velocity of a pixel at [u, v] = [0, 0], Z = 100 cm, when the camera moves left at 5 cm/s?
Solution:
Define the camera motion: [vx, vy, vz, ωx, ωy, ωz]^T = [−5, 0, 0, 0, 0, 0]^T
Compute the Jacobian:
J_p(0, 0, 100) = [ −0.01    0     0   0   −1   0 ]
                 [  0     −0.01   0   1    0   0 ]
Compute the pixel velocity:
[u̇, v̇]^T = J_p · [−5, 0, 0, 0, 0, 0]^T = [0.05, 0]^T

Slide 27 – Solving for camera velocity
The image Jacobian relates the velocity of the camera in 3D space to the velocity of the pixels in the image plane.
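The single-point relation and the worked example above can be checked numerically. A sketch assuming NumPy; image_jacobian is a hypothetical helper implementing the slide's matrix, not lecture code:

```python
import numpy as np

def image_jacobian(u, v, Z, f_hat=1.0):
    """2x6 image Jacobian J_p(u, v, Z), using the sign convention of the
    slides (maps camera velocity [vx, vy, vz, wx, wy, wz] to [u_dot, v_dot])."""
    return np.array([
        [-f_hat / Z, 0.0, u / Z, -u * v / f_hat, -(f_hat + u**2 / f_hat),  v],
        [0.0, -f_hat / Z, v / Z, f_hat + v**2 / f_hat, -u * v / f_hat,    -u],
    ])

# Worked example: f_hat = 1, pixel at (0, 0), Z = 100 cm,
# camera moving left at 5 cm/s.
camera_velocity = np.array([-5.0, 0.0, 0.0, 0.0, 0.0, 0.0])
pixel_velocity = image_jacobian(0.0, 0.0, 100.0) @ camera_velocity
# pixel_velocity = [0.05, 0], matching the slide
```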
The relationship between pixel and camera velocities for 3 points is:

[u̇1, v̇1, u̇2, v̇2, u̇3, v̇3]^T = stack( J_p(u1, v1, Z1); J_p(u2, v2, Z2); J_p(u3, v3, Z3) ) · [vx, vy, vz, ωx, ωy, ωz]^T

Re-arranging for the camera velocities:

[vx, vy, vz, ωx, ωy, ωz]^T = stack( J_p(u1, v1, Z1); J_p(u2, v2, Z2); J_p(u3, v3, Z3) )^(−1) · [u̇1, v̇1, u̇2, v̇2, u̇3, v̇3]^T

Slide 28 – Homework
Task: your computer vision system has detected features at pixel locations (u1, v1) = (5, 5), (u2, v2) = (10, 10), (u3, v3) = (15, 5). The same features in the reference image are located at (u'1, v'1) = (15, 5), (u'2, v'2) = (20, 10), (u'3, v'3) = (25, 5). For a camera with f̂ = 1, and assuming all points are at Z = 100 cm, use the image Jacobian matrix to compute the camera movement required to return to the desired position.

Slide 29 – Homework (answer)
[vx, vy, vz, ωx, ωy, ωz]^T = [−1000, 0, 0, 0, 0, 0]^T

Slide 30 – Visual Servoing Control Scheme
Desired pixel positions (taken directly from the reference image) are compared with the current pixel positions (directly measured) to give desired pixel velocities; the image Jacobian converts these from pixel to camera velocities, which drive the camera (i.e. end-effector) position.

Slide 31 – ViSP Demo
https://www.youtube.com/watch?v=4Se-_LIw51I

Slide 32 – Quiz: Which method of control would be best suited to this robot arm?
A. Kinematic
B.
Visual Servoing

Slide 33 – Quiz: Which method of control would be best suited to this robot arm?
A. Kinematic
B. Visual Servoing

Slide 34 – Summary of Visual Servoing
1. Visual servoing is a method of robot control that seeks to minimise the error between image features at a "target position" and the same features in the current view.
2. The image Jacobian allows us to translate desired pixel velocities into camera velocities.
3. Because the camera directly measures the positions of the end-effector and the target, the method requires no knowledge of its own position and can recover from sensing and positioning errors.
4. Not perfect: issues are covered in the next lecture.

Slide 35 – Where to find out more
1. Textbook: Peter Corke, Robotics, Vision and Control.
2. Key journal papers:
   1. Hutchinson, S., Hager, G. D., and Corke, P. I. "A tutorial on visual servo control." IEEE Transactions on Robotics and Automation 12.5 (1996): 651-670.
   2. Chaumette, F., and Hutchinson, S. "Visual servo control. I. Basic approaches." IEEE Robotics & Automation Magazine 13.4 (2006): 82-90.
   3. Chaumette, F., and Hutchinson, S. "Visual servo control. II. Advanced approaches [Tutorial]." IEEE Robotics & Automation Magazine 14.1 (2007): 109-118.
3. Online resources: Robot Academy.

Slide 36 – This lecture:
1. Using kinematics for robot arm reaching
2.
Image-Based Visual Servoing
3. Grasping

Slide 37 – Visual Controlled Robot Grasping
Task: move the end-effector to position A; pick up an object; move to position B.
Method: visual control only.
Result: not good.

Slide 38 – Soft grippers avoid the issue
https://www.youtube.com/watch?v=86G9DLJEagw (Universal Gripper from iRobot)

Slide 39 – The importance of touch
Task: pick up a match; strike the match; blow out the flame.
Method: visual + touch.
Result: good.

Slide 40 – Visually Controlled Human Grasping
Task: pick up a match; strike the match; blow out the flame.
Method: visual control only, with anaesthetised fingers (blocks all sense of touch from the finger; does not affect motor control).
Result: not good.

Slide 41 – Visually Controlled Human Grasping (continued)
The same result is seen in a patient with long-term neural dystrophy of touch: humans don't recover over time.

Slide 42 – Complicated Sensory System
Over 15,000 sensors in total in the human hand, covering:
1. Light touch
2. Sudden disturbances
3. Stretch
4. Mechanical pressure
(Neuroscience, 2nd edition. Purves D, Augustine GJ, Fitzpatrick D, et al., editors. Sunderland (MA): Sinauer Associates; 2001.)

Slide 43 – Adding touch to robot grippers
https://www.youtube.com/watch?time_continue=3&v=GJ_Zki8e8Kw

Slide 44 – Quiz: Which type of gripper?
An underwater salvage robot…
A. Suction cup
B. Inflatable mould
C. Standard hook
D. Tactile hand

Slide 45 – Quiz: Which type of gripper?
A repetitive pick-and-place task… (same options A-D)

Slide 46 – Quiz: Which type of gripper?
A prosthetic arm… (same options A-D)

Slide 47 – Summary of Grasping
1. Visual control alone is insufficient for gripping non-standardised objects.
2. Soft grippers can overcome some limitations.
3. Touch is essential for human grasping but is in itself a complex system.
4. Cutting-edge grippers are adding touch for robot and prosthetic applications, for better performance.

Slide 48 – Where to find out more
1. Dr Hannes Saal, University of Sheffield: final-year projects in this area. https://www.sheffield.ac.uk/psychology/staff/academic/hannes_saal
2. Reading materials: Dahiya RS, Metta G, Valle M, Sandini G. "Tactile Sensing—From Humans to Humanoids." IEEE Transactions on Robotics. 2010;26: 1-20.
3. Online materials: GRABlab at Yale; Jan Peters' lab at TU Darmstadt.

Slide 49 – Next lecture…
Local Guidance Strategies

Slide 50 – Homework - Solution
Task (restated from slide 28): features detected at (u1, v1) = (5, 5), (u2, v2) = (10, 10), (u3, v3) = (15, 5); the same features in the reference image at (u'1, v'1) = (15, 5), (u'2, v'2) = (20, 10), (u'3, v'3) = (25, 5). For a camera with f̂ = 1, and assuming all points are at Z = 100 cm, use the image Jacobian matrix to compute the camera movement required to return to the desired position.
Answer: [vx, vy, vz, ωx, ωy, ωz]^T = [−1000, 0, 0, 0, 0, 0]^T

Slide 51 – Homework - Solution
Step 1 – Compute the pixel velocities:
[u̇1, v̇1, u̇2, v̇2, u̇3, v̇3]^T = [15−5, 5−5, 20−10, 10−10, 25−15, 5−5]^T = [10, 0, 10, 0, 10, 0]^T

Step 2 – Build the 6×6 image Jacobian by stacking the three 2×6 image Jacobians:

Jacobian = [ −0.01    0.00   0.05    −25    −26     5 ]
           [  0.00   −0.01   0.05     26    −25    −5 ]
           [ −0.01    0.00   0.10   −100   −101    10 ]
           [  0.00   −0.01   0.10    101   −100   −10 ]
           [ −0.01    0.00   0.15    −75   −226     5 ]
           [  0.00   −0.01   0.05     26    −75   −15 ]

Slide 52 – Homework - Solution
Step 3 – Inverting a 6×6 matrix by hand is not the point here, so solve programmatically!
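Following the slide's advice to solve programmatically, the three steps above can be reproduced as follows. A sketch assuming NumPy; image_jacobian is a hypothetical helper implementing the slide-25 matrix, not lecture code:

```python
import numpy as np

def image_jacobian(u, v, Z, f_hat=1.0):
    """2x6 image Jacobian J_p(u, v, Z) in the slides' sign convention."""
    return np.array([
        [-f_hat / Z, 0.0, u / Z, -u * v / f_hat, -(f_hat + u**2 / f_hat),  v],
        [0.0, -f_hat / Z, v / Z, f_hat + v**2 / f_hat, -u * v / f_hat,    -u],
    ])

current = [(5.0, 5.0), (10.0, 10.0), (15.0, 5.0)]
desired = [(15.0, 5.0), (20.0, 10.0), (25.0, 5.0)]
Z, f_hat = 100.0, 1.0

# Step 1: pixel velocities = desired minus current feature positions.
pixel_vel = []
for (u, v), (ud, vd) in zip(current, desired):
    pixel_vel += [ud - u, vd - v]
pixel_vel = np.array(pixel_vel)          # [10, 0, 10, 0, 10, 0]

# Step 2: stack the three 2x6 Jacobians into one 6x6 matrix.
J = np.vstack([image_jacobian(u, v, Z, f_hat) for u, v in current])

# Step 3: solve the linear system instead of inverting explicitly.
camera_vel = np.linalg.solve(J, pixel_vel)
# camera_vel = [-1000, 0, 0, 0, 0, 0]: pure translation along -x
```

In a real controller the raw pixel error would be scaled by a gain before being commanded as a velocity; here the un-scaled solve reproduces the slide's answer of [−1000, 0, 0, 0, 0, 0].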