Robot Path Planning for Object Manipulation
Over the past several weeks we've shown how the DeskBot system
performs scene segmentation and object detection, and we have also covered the
Hamster robot and its role. In this update, let's dive into the robot
operations.
Once the system can perceive the scene and detect all the objects
present in it, we can give our robot multiple behaviors
aimed at manipulating an object and pushing it towards a goal. From the Scene
Segmentation and Object Detection module we get the following:
- Position and centroid of the robot
- Positions and centroids of the detected unique objects
- Table dimensions and locations of the object holder goal points
- Euclidean distances from the objects to their respective goal holders
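The distance computation in the last item can be sketched as follows. This is a minimal illustration, assuming the perception module reports centroids and goal points as (x, y) coordinates; the sample values are made up, not real DeskBot output.

```python
# A minimal sketch of the object-to-goal distance listed above, assuming
# centroids and goal holder points arrive as (x, y) coordinates.
import math

def euclidean(p, q):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical example: one detected object and its assigned goal holder.
object_centroid = (120.0, 80.0)
goal_holder = (420.0, 480.0)
print(euclidean(object_centroid, goal_holder))  # 500.0 (a scaled 3-4-5 triangle)
```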
The figure below gives a technical overview of how robot-object
manipulation is achieved. Using all of the above information from the scene
segmentation module, we obtain closed-loop vision feedback from the scene and
then run path planning so the robot can push an object from its
position to the goal along the generated path. Path planning
is achieved through Jump Point Search, which is explained in detail in the sections below.
Jump Point Search
Jump Point Search (JPS) is an
optimization of the A* pathfinding algorithm for uniform-cost
grids. It reduces symmetries in the search procedure by means of graph pruning:
certain nodes in the grid are eliminated based on assumptions that can be made
about the current node's neighbors, as long as certain conditions on
the grid are satisfied. As a result, the algorithm can consider long jumps
along straight (horizontal, vertical, and diagonal) lines in the grid,
rather than the small steps from one
grid position to the next as in A*. This makes the algorithm very fast and
also reduces the number of nodes it adds to its open list.
Unlike greedy best-first search and A*, Jump Point Search [1] explores all
directions until it finds the goal in one of them. This may at
times make the algorithm slower than the other two,
especially in the case of very big maps.
Starting from a point S, JPS performs a local search along the four main cardinal
directions. In each direction, the algorithm searches for:
- The goal.
- A point at which an optimal path changes direction, i.e. a node with a forced neighbor.
If a direction ends in an obstacle, that direction is completely discarded and not explored any further.
If a direction contains the goal or a forced neighbor, that node is added to the open list for further exploration.
Searching for a forced neighbor is a local search process: at each step, only the 8 nodes
surrounding the current node are considered.
When the goal is found, the jump points (the nodes in the open list) are connected
through straight horizontal and vertical path segments to produce the optimal
path.
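The cardinal-direction "jump" step described above can be sketched as follows. This is an illustrative simplification on a boolean occupancy grid, not DeskBot's actual implementation; the grid layout, function names, and forced-neighbor test are assumptions consistent with the description in [1].

```python
# A minimal sketch of the cardinal-direction jump step in Jump Point Search,
# assuming a boolean grid where True marks a free cell (illustrative only).

def walkable(grid, x, y):
    """A cell is walkable if it lies on the grid and is free."""
    return 0 <= y < len(grid) and 0 <= x < len(grid[0]) and grid[y][x]

def has_forced_neighbor(grid, x, y, dx, dy):
    """For a cardinal move (dx, dy), a neighbor is 'forced' when a cell
    beside the travel axis is blocked but the cell diagonally ahead of
    it is free, so an optimal path may have to turn here."""
    if dx != 0:  # moving horizontally: check the cells above and below
        return ((not walkable(grid, x, y - 1) and walkable(grid, x + dx, y - 1)) or
                (not walkable(grid, x, y + 1) and walkable(grid, x + dx, y + 1)))
    else:        # moving vertically: check the cells left and right
        return ((not walkable(grid, x - 1, y) and walkable(grid, x - 1, y + dy)) or
                (not walkable(grid, x + 1, y) and walkable(grid, x + 1, y + dy)))

def jump(grid, x, y, dx, dy, goal):
    """Slide from (x, y) in direction (dx, dy) until reaching the goal or
    a node with a forced neighbor (both become jump points), or hitting
    an obstacle / the grid edge (direction discarded: returns None)."""
    while True:
        x, y = x + dx, y + dy
        if not walkable(grid, x, y):
            return None                   # blocked: discard this direction
        if (x, y) == goal or has_forced_neighbor(grid, x, y, dx, dy):
            return (x, y)                 # new jump point for the open list
```

On an empty grid the jump runs straight to the goal in one call; an obstacle placed beside the travel line produces a forced neighbor, and the jump stops there instead.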
In our implementation of JPS, we assumed every node on the map to be
approximately 1.5 times the size of the robot. The goal positions were fixed
positions on the table, determined by the object holders' locations. The JPS
algorithm finds a path between the object and the goal, and
the robot then manipulates the object, pushing it along the waypoints of the
path until it reaches the goal.
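The grid discretization above can be sketched as follows. The robot footprint used here is an assumed placeholder value, not a measured Hamster dimension.

```python
# A sketch of discretizing the table into a JPS grid where each cell is
# roughly 1.5x the robot's footprint, as described above. The robot size
# below is an illustrative assumption, not a measured value.

ROBOT_SIZE_CM = 4.0             # Hamster robot footprint (assumed)
CELL_CM = 1.5 * ROBOT_SIZE_CM   # one grid node ~ 1.5x robot size

def world_to_grid(x_cm, y_cm):
    """Map a table coordinate (cm) to the grid cell containing it."""
    return int(x_cm // CELL_CM), int(y_cm // CELL_CM)

def grid_to_world(col, row):
    """Map a grid cell back to the center of that cell on the table (cm)."""
    return (col + 0.5) * CELL_CM, (row + 0.5) * CELL_CM
```

JPS then plans over the cell indices, and the resulting jump points are converted back to table coordinates to serve as the robot's waypoints.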
The figure shows an example search performed by the Jump Point Search algorithm.
Jump Point Search with Robot - Object Manipulation
Orient Object - One Hamster robot is used to align the object's slope with that of the drop position. Orientation is approximated by calculating the slope of the diagonal of the object's bounding box obtained from YOLO.
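The orientation approximation can be sketched as follows, assuming the detector reports axis-aligned boxes in an (x1, y1, x2, y2) format; that format and the function names are assumptions for illustration.

```python
# A sketch of the orientation estimate described above: approximate the
# object's slope by the diagonal of its axis-aligned YOLO bounding box.
# The (x1, y1, x2, y2) box format is an assumption about the detector output.
import math

def bbox_diagonal_angle(x1, y1, x2, y2):
    """Angle (radians) of the bounding-box diagonal relative to the x-axis."""
    return math.atan2(y2 - y1, x2 - x1)

def orientation_error(box, drop_angle):
    """How far the object must still be rotated to match the drop position."""
    return drop_angle - bbox_diagonal_angle(*box)
```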
Push Object - Two Hamster robots work in unison to push the object, once oriented, towards the goal along the path given by JPS.
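The waypoint-following part of the push behavior can be sketched as pure geometry: pick the next unreached waypoint and compute the heading the robots must push along. Driving the two Hamster robots from that heading is out of scope here, and the tolerance value is an assumption.

```python
# A minimal sketch of stepping through the JPS waypoints while pushing.
# Positions are table coordinates (cm); reach_tol is an assumed tolerance.
import math

def heading_to(p, q):
    """Bearing (radians) from point p to point q on the table plane."""
    return math.atan2(q[1] - p[1], q[0] - p[0])

def next_waypoint(object_pos, path, reach_tol=1.0):
    """Drop leading waypoints already reached (within reach_tol cm) and
    return the next one to push towards, or None when the path is done."""
    while path and math.hypot(path[0][0] - object_pos[0],
                              path[0][1] - object_pos[1]) <= reach_tol:
        path = path[1:]          # this waypoint is reached; move on
    return path[0] if path else None
```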
Important Note:
Since our goal is to transport the object from location A to B, we have to orient the
object and then push it along the path to the goal. If the
goal were only to move the robot from A to B, the Orient Object step would not be
required.
Update 7
The next update will show the integration of the Scene Segmentation and Object Detection module with the Robot Path Planning for Object Manipulation module. This will bring the whole DeskBot system together, and we should see DeskBot decluttering and cleaning desks!!! :)
References
[1] https://zerowidth.com/2013/a-visual-explanation-of-jump-point-search.html