The theme that we got was Sahayak Bot. We had to program an Autonomous Ground Vehicle (AGV) to autonomously traverse an indoor environment and assist in moving objects from one place to another. The scenario was that objects were being moved in e-Yantra's lab, the lab was short of manpower, and so the Sahayak Bot was used to help move the boxes.
Challenges in this theme included: 2D mapping, 3D mapping, autonomous navigation, perception, and pick and place.
Fig. - Sahayak Bot
Our team's performance:
| Task | Objectives | Score | ~ No. of teams who completed the task |
|---|---|---|---|
| Task 0 | To get familiar with Ubuntu & ROS | Accepted | 330 |
| Task 1 | To get an overview of Gazebo | 87.33/100 | 270 |
| Task 2 (Video Submission Link) | To explore Mapping and Navigation in ROS | 90.05/100 | 179 |
| Task 3 (Link) | To explore Robotic Arm Manipulation in ROS using the MoveIt motion planning framework | 100/100 | 139 |
| Task 4 (Link) | To perform Perception | 100/100 | 70 |
| Task 5 (Link) | To combine the learnings of all previous tasks: navigate to the correct location to pick up objects using the navigation pipeline; detect objects using the perception pipeline; grasp objects with the MoveIt planner; navigate to the destination using the navigation pipeline; place the objects in the correct drop box | 31.42/100 | 45 |
| Task 6 (Original, Bonus) | Final task: similar to Task 5 with added constraints such as a 48-hour time limit and original and bonus configurations | Original: 625.8, Bonus: 151.67, Code: 100/100 | 41 |
**Result:** We were not shortlisted for the Finals of the competition. e-Yantra has not released overall rankings, but by our estimate we finished in the top 25; the top 7 teams were selected for the Finals. The competition was a good learning experience.
Tools and packages used:

- ROS
- Gazebo
- MoveIt
- find_object_2d
- Navigation packages (gmapping, AMCL, map_server, tf2)
- e-Yantra's original repo, which contains the theme's base packages
This repository contains the additional ROS packages we built and used to complete the tasks.
- ebot_nav - 2D mapping and navigation
- ebot_perc - Arm manipulation and perception
- ebot_task5 - Main controller
In total, five scripts are used:
- /ebot_task5/scripts/SB#637_main2_task6.py - Main script that controls the bot
- /ebot_nav/scripts/SB#637_move_base_task5.py - Navigational control
- /ebot_perc/scripts/SB#637_moveit_perc.py - Perception and Arm control
- /ebot_task5/scripts/SB#637_img_dis.py - To display required images with object names and bounding boxes
- /ebot_task5/scripts/SB#637_mainr.py - To meet the remaining output terminal print requirements
All the scripts communicate with one another over ROS topics; a minimal sketch of this pattern is shown below.
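The sketch assumes a hypothetical `/nav_command` topic carrying a plain `String`; the actual scripts define their own topic names and message types.

```python
#!/usr/bin/env python
# Minimal sketch of topic-based coordination between two of the scripts.
# The "/nav_command" topic and String payload are illustrative only.
import rospy
from std_msgs.msg import String

def nav_command_cb(msg):
    # Navigation-script side: react to a command from the main script.
    rospy.loginfo("Main script asked us to navigate to: %s", msg.data)

if __name__ == "__main__":
    rospy.init_node("topic_demo")
    rospy.Subscriber("/nav_command", String, nav_command_cb)
    # Main-script side: publish a command (latched so late subscribers get it).
    pub = rospy.Publisher("/nav_command", String, queue_size=1, latch=True)
    rospy.sleep(0.5)  # give the connection time to establish
    pub.publish(String(data="store_room"))
    rospy.spin()
```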
The general approach in our solution is as follows (a sketch of the overall sequence follows this list):

- First, the required objects and their destinations are collected into lists.
- Next, we determine where each of these objects can be picked up from, and maintain that list as well.
- The main script then tells the move_base node to go to a particular room.
- Once the bot arrives, the main script tells the MoveIt script to detect the objects in that room. Once detection completes, the main script knows where the required object is kept.
- It then sends the move_base node the location from which the object can be picked up.
- Once the bot arrives, the main script tells the MoveIt script which object to pick up. The MoveIt script performs its operations and, after picking the required object, informs the main script.
- Once the object is picked, the main script sends the destination room goal to the move_base node.
- When the ebot reaches its destination, the main script commands the MoveIt script to drop the object into the drop box.
- The whole sequence is repeated for any remaining objects.
- Alongside all this, the remaining rulebook requirements are met by the other two scripts.
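As a rough illustration of this sequence, here is a hedged sketch built on the standard move_base action interface and moveit_commander. The move group names, named poses, and coordinates are placeholders, not the theme's actual configuration:

```python
#!/usr/bin/env python
# Hedged sketch of the pick-and-place sequence described above.
import sys
import rospy
import actionlib
import moveit_commander
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def goto(client, x, y, w=1.0):
    """Send a map-frame goal to move_base and block until it finishes."""
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = w  # identity orientation by default
    client.send_goal(goal)
    client.wait_for_result()

if __name__ == "__main__":
    rospy.init_node("pick_place_demo")
    moveit_commander.roscpp_initialize(sys.argv)

    nav = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    nav.wait_for_server()
    arm = moveit_commander.MoveGroupCommander("arm")          # placeholder group
    gripper = moveit_commander.MoveGroupCommander("gripper")  # placeholder group

    goto(nav, 2.0, 3.5)                  # 1. drive to the pickup room (placeholder coords)
    # 2. object detection would run here (see the find_object_2d sketch below)
    arm.set_named_target("pick_pose")    # placeholder named pose
    arm.go(wait=True)
    gripper.set_named_target("close")    # placeholder named pose
    gripper.go(wait=True)                # 3. grasp the object
    goto(nav, -1.0, 0.5)                 # 4. drive to the drop room (placeholder coords)
    gripper.set_named_target("open")
    gripper.go(wait=True)                # 5. release into the drop box
```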
To hold the bot at a particular position, drift neutralization (issuing small corrective velocity commands) is also used. For perception, a session was created and trained in find_object_2d and used at run time; its detections arrive as described below.
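For reference, find_object_2d publishes its matches on an `objects` topic as a flat `Float32MultiArray`, with 12 floats per detection: the object ID, the trained template's width and height, and a 3x3 homography. A minimal listener might look like this (node and variable names are our own):

```python
#!/usr/bin/env python
# Minimal sketch: reading detections published by find_object_2d.
import rospy
from std_msgs.msg import Float32MultiArray

def objects_cb(msg):
    data = msg.data
    for i in range(0, len(data), 12):  # 12 floats per detected object
        obj_id = int(data[i])
        width, height = data[i + 1], data[i + 2]
        # data[i+3 : i+12] holds the homography mapping the trained image
        # onto the current frame (useful for drawing bounding boxes).
        rospy.loginfo("Detected object %d (template %.0fx%.0f)",
                      obj_id, width, height)

if __name__ == "__main__":
    rospy.init_node("object_listener")
    rospy.Subscriber("objects", Float32MultiArray, objects_cb)
    rospy.spin()
```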