COM2009-3009 Robotics: Lecture 8 Local Guidance 2022-23 PDF
Document Details
The University of Sheffield
2023
Dr Tom Howard
Summary
This document is a lecture on robotics, focusing on local guidance techniques for robots. It covers search methods such as Brownian motion, Lévy flights and spiral search, along with beaconing and visual homing, and discusses the theoretical foundations and practical applications of each technique.
Full Transcript
© 2023 The University of Sheffield

Slide 1: COM2009-3009 Robotics, Lecture 8: Local Guidance
Dr Tom Howard, Multidisciplinary Engineering Education (MEE)

Slide 2: Moving the whole body
(Source of images: Wikimedia Commons)

Slide 3: This lecture will cover
1. Search
2. Beaconing
3. Visual Homing

Slide 4: 1. Search Strategies
Search: the act of searching for someone or something.

Slide 5: When to search?
Example applications: search and rescue; mining in space; fishing robots; finding home.
1. Absence of guidance cues.
2. Unpredictable distribution and location of targets.
3. To explore an area thoroughly.

Slide 6: How to search?
First, let's look at how it's done in nature…

Slide 7: Brownian Motion
• Continuous search, commonly with small variation in step size but random turns.
• Observed in particles, and in some animals (e.g. the albatross) searching for local and abundant resources.
• This may not be an optimal approach…

Slide 8: Lévy Flights
• Lévy flights use a long-tailed distribution to select the step size, generating clusters of searches interspersed with long paths into new areas. (The slide shows a frequency vs step-size plot of this long-tailed distribution.)
• Animals including bees and sharks do this when searching for food and other resources.
• Often more effective than Brownian motion.

Slide 9: How to search?
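The difference between the two nature-inspired strategies above comes down to the step-length distribution: clustered around a typical scale for Brownian motion, heavy-tailed (power-law) for a Lévy flight. A minimal sketch, where the exponent `alpha`, the step scales, and the function names are illustrative choices rather than values from the lecture:

```python
import math
import random

def brownian_step(scale=1.0):
    # Brownian-style: step lengths cluster around a typical scale.
    return abs(random.gauss(scale, 0.25 * scale))

def levy_step(alpha=1.5, min_step=1.0):
    # Levy-style: power-law (Pareto) tail, so occasional very long steps.
    return min_step * random.paretovariate(alpha)

def random_walk(step_fn, n_steps=1000):
    """Isotropic 2D walk: random heading each step, length from step_fn."""
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        heading = random.uniform(0.0, 2.0 * math.pi)
        d = step_fn()
        x += d * math.cos(heading)
        y += d * math.sin(heading)
        path.append((x, y))
    return path

brownian_path = random_walk(brownian_step)
levy_path = random_walk(levy_step)
```

Plotting the two paths shows the characteristic difference: the Brownian walk stays in a compact blob, while the Lévy walk shows clusters of local search linked by long relocation legs.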
And now, for robots…

Slide 10: Spiral Search
• Robots without a map of the environment (e.g. the early Roomba) can use an exhaustive search such as spiral search.
• Hybrid strategies, with a local spiral and then edge following, can give better coverage, in a similar way that Lévy flights perform sparse local searches.

Slide 11: Your robot's battery level is critical and it needs to find its charging platform. The robot can detect that it is within 3 m of the platform, but that is all it knows. What search method should be used?
A. Brownian Motion
B. Lévy Walk
C. Spiral Search
D. Something else

Slide 12: A fleet of mini mining robots is to locate sparsely distributed minerals on the lunar surface. The robots' sensors can only detect minerals at close range. What search method would you use?
A. Brownian Motion
B. Lévy Walk
C. Spiral Search
D. Something else

Slide 13: 2. Beaconing
Beaconing / taxis: directed movement towards an observable cue.

Slide 14: Beaconing by Reactive Control
Braitenberg's Vehicle 2b, "Aggression": crossed excitatory (+) sensor-to-motor connections.

Slide 15: What combination of connections would lead to approach-and-stop behaviour?
A. Contralateral (crossed) positive
B. Contralateral (crossed) negative
C. Ipsilateral (uncrossed) positive
D. Ipsilateral (uncrossed) negative

Slide 16: Beaconing by Reactive Control
Braitenberg's Vehicle 3a, "Love": uncrossed inhibitory (-) sensor-to-motor connections.

Slide 17: Taxis in nature
https://youtu.be/aMeKvWU1CnQ

Slide 18: And in robots
https://robowarner.com/portfolio/autonomous-homing-robot/

Slide 19: Beaconing with a single sensor?
But what about single-sensor systems?

Slide 20: An application of "gradient descent"
(With a single sensor the robot cannot compare left and right readings, but it can still climb the intensity gradient by comparing successive readings over time.)

Slide 21: When is beaconing useful?
1. When the target can be perceived directly.
2. When orientation (and roughly distance) from the current position can be deduced.
This relies on direct observation of the target.

Slide 22: Limitations of beaconing
1. Range issues from occlusions, and signal loss with respect to distance (shrinks the gradient area).
2. Noisy environments (disrupt the smooth gradient).
3. Can be trapped in local minima; needs more complex architectures, e.g. subsumption.

Slide 23: 3. Visual Homing
Homing: to move or be aimed towards an imperceptible target with great accuracy.

Slide 24: Inspired by Nature
In the 1950s Nico Tinbergen (Nobel Prize, 1973) showed that wasps approach their hidden nests using surrounding visual cues, e.g. the pattern of surrounding objects. Similar behaviours are found in many animals: bees, ants, hummingbirds, rats. Roboticists and biologists have since developed many homing models.
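The Braitenberg controllers from the beaconing section reduce to a couple of lines of reactive wiring each. A minimal differential-drive sketch, assuming (this convention is not from the lecture) that each sensor reports light intensity in [0, 1]; function names and the gain are illustrative:

```python
def vehicle_2b(left_sensor, right_sensor, gain=1.0):
    """Braitenberg Vehicle 2b ("Aggression"): crossed (contralateral)
    excitatory connections, so the vehicle turns towards the source."""
    left_motor = gain * right_sensor   # right sensor drives left motor
    right_motor = gain * left_sensor   # left sensor drives right motor
    return left_motor, right_motor

def vehicle_3a(left_sensor, right_sensor, max_speed=1.0, gain=1.0):
    """Braitenberg Vehicle 3a ("Love"): uncrossed (ipsilateral)
    inhibitory connections, so the vehicle approaches and stops."""
    left_motor = max_speed - gain * left_sensor    # ipsilateral inhibition
    right_motor = max_speed - gain * right_sensor
    return left_motor, right_motor

# Source off to the left: the left sensor reads higher, so in both
# vehicles the right motor runs faster and the robot turns left,
# towards the source.
print(vehicle_2b(0.8, 0.2))
print(vehicle_3a(0.8, 0.2))
```

With Vehicle 3a, motor speeds fall to zero as both sensors saturate near the source, which is the approach-and-stop behaviour asked about on slide 15.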
Slide 25: Visual Homing: the concept
Assumptions:
- A visual feature is perceived at a given position in the visual field from only a single location in the environment.
- The change in position of visual features across locations is predictable.
Method:
- Select the direction (and distance) that will minimise the difference between the current view and a view stored at the target location (the "snapshot image").
(Diagram: the current location versus the "snapshot" stored at home.)

Slide 26: Visual Homing: the concept (continued)
Catchment area:
- A single stored view provides guidance across a potentially large surrounding region, known as the catchment area.
Visual homing is very closely related to visual servoing.

Slide 27: Requirements & Assumptions
1. Panoramic vision is required (panoramic images are used for homing).
2. Views must be pre-aligned:
   - rotation can be corrected using a compass;
   - tilt can be corrected using an IMU.
(The slide shows a common robot setup for visual homing.)

Slide 28: Challenges for feature-based methods
Common problems:
1. How to robustly find features?
2. The "correspondence problem": how to match features between the current scene and memory? What if some features are missing?

Slide 29: Method 1 – Average Landmark Vector (ALV)
Before leaving home:
1. Point a unit vector at each "landmark".
2. Compute the average landmark vector and store it as ALV_home.
With the slide's example landmark vectors [-1.4, 1.4], [1.4, 1.4] and [0, -1]:
ALV_home = ((-1.4 + 0 + 1.4)/3, (1.4 - 1 + 1.4)/3) = (0, 0.6)   (red arrow)
(Lambrinos et al., 1998)

Slide 30: Method 1 – Average Landmark Vector (ALV)
To home:
1. Compute the average landmark vector at the current location in the same way (ALV_now).
2. Compute the home vector by vector subtraction: Home Vector = ALV_now - ALV_home.
With the current-location landmark vectors [0.7, -0.7], [-0.7, -0.7] and [0, -1]:
ALV_now = ((-0.7 + 0 + 0.7)/3, (-0.7 - 1 - 0.7)/3) = (0, -0.8)   (blue arrow)
(Lambrinos et al., 1998)

Slide 31: Method 1 – Average Landmark Vector (ALV)
Putting the two together:
ALV_now = (0, -0.8)
ALV_home = (0, 0.6)
Home Vector = ALV_now - ALV_home = (0 - 0, -0.8 - 0.6) = (0, -1.4)   (black arrow)
(Lambrinos et al., 1998)

Slide 32: ALV on a Mobile Robot
Successes in reproducing animal behaviour:
- Ants: Sahabot (Lambrinos et al.)
- Crickets (Mangan & Webb, 2008)
✓ Good for sparse environments
✓ Reduces the correspondence problem
Also applied to robot vacuum cleaners: the Möller group, University of Bielefeld (image warping).
(From Möller et al., 1998, "Insect Strategies of Visual Homing in Mobile Robots")

Slide 33: Method 2 – Image Difference Functions (IDF)
Gradient descent in image differences: Zeil (2003) showed that the pixel-wise difference (e.g. RMS) between the home and current images increases with distance, giving a gradient that can be followed home. (Recall gradient descent using a single sensor from earlier.)
✓ Good for complex environments
✓ No need to detect or match features at all (no correspondence problem)
(Zeil et al., 2003)

Slide 34: IDF on a Mobile Robot
Example real-world IDFs for three environments: distinct landmarks, blank walls, and a natural scene.
(Mangan & Webb, 2009: https://link.springer.com/article/10.1007/s00422-009-0338-1)

Slide 35: IDF on a Mobile Robot
Example home vectors computed using IDFs in the same three environments.
(Mangan & Webb, 2009: https://link.springer.com/article/10.1007/s00422-009-0338-1)

Slide 36: When is visual homing useful?
- Long-range line of sight provides long-distance guidance.
- Short line of sight requires many stored memories (more on this next time!).
- Final approach to a specific location (automated piloting).

Slide 37: This lecture has covered…
1. Search strategies (when you have nothing): Brownian, Lévy and spiral search.
2. Beaconing (when you can sense a target): approach an observable target; reactive strategies are sufficient; can explain animal taxis and can solve many robot tasks.
3. Visual Homing (when you can observe the local surroundings): ALV and IDF methods described; benefits and limitations outlined.
Q. What do all these guidance methods have in common?
A. No knowledge of current location!

Slide 38: Further Reading
• Ulrich Nehmzow, "Mobile Robotics: A Practical Introduction", Springer, 2nd edition, 2002.
• G. A. Bekey, "Autonomous Robots: From Biological Inspiration to Implementation and Control", MIT Press. Chapter 14.
• B. Webb and T. R. Consi (2002), "Biorobotics: Methods and Applications", Industrial Robot: An International Journal, Vol. 29, Issue 3.

Slide 39: Next time… Localisation

Slide 40: References
1. Cartwright, B. A., and T. S. Collett. "How honey bees use landmarks to guide return to a food source." Nature 295 (1982): 560–564.
2. Lambrinos, Dimitrios, et al. "Landmark navigation without snapshots: the average landmark vector model." Proc. Neurobiol. Conf. Göttingen, 1998.
3. Hafner, Verena Vanessa. "Adaptive homing—robotic exploration tours." Adaptive Behavior 9.3–4 (2001): 131–141.
4. Mangan, Michael, and Barbara Webb. "Modelling place memory in crickets." Biological Cybernetics 101.4 (2009): 307–323.
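To close, the ALV homing computation from Method 1 can be checked in a few lines of Python, using the illustrative landmark vectors from the worked example on the slides (the function names are my own, not from the lecture):

```python
def average_landmark_vector(landmark_vectors):
    """Average the vectors pointing from the robot to each landmark."""
    n = len(landmark_vectors)
    return (sum(v[0] for v in landmark_vectors) / n,
            sum(v[1] for v in landmark_vectors) / n)

def home_vector(alv_now, alv_home):
    """ALV homing: the home direction is the difference of the two ALVs."""
    return (alv_now[0] - alv_home[0], alv_now[1] - alv_home[1])

# Landmark vectors as seen from home (slide example).
alv_home = average_landmark_vector([(-1.4, 1.4), (1.4, 1.4), (0, -1)])
# Landmark vectors as seen from the current location (slide example).
alv_now = average_landmark_vector([(0.7, -0.7), (-0.7, -0.7), (0, -1)])

print(alv_home)                        # approx (0, 0.6), the red arrow
print(home_vector(alv_now, alv_home))  # approx (0, -1.4), the black arrow
```

Note how little state the method needs: one stored two-component vector per goal, with no feature matching at all, which is why it suits sparse environments and sidesteps the correspondence problem.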