e-Yantra is a robotics outreach program funded by the Ministry of Education and hosted at IIT Bombay. Its goal is to harness the talent of young engineers to solve problems using technology across a variety of domains, such as agriculture, manufacturing, defence, home automation, smart-city maintenance, and service industries.
The “Cosmo Logistic” theme of eYRC 2023-24 is set in a warehouse used for inter-planet logistics from a space station. A robotic arm and a mobile robot collaborate to sort and prepare packages to be transported to different planets. In this theme, our team developed an algorithm for sorting packages autonomously with the help of the robotic arm and mobile robot shown in Figure 1.
We learnt to navigate the mobile robot through a warehouse using SLAM (Simultaneous Localization and Mapping), to detect and localise the packages placed on racks, and to manipulate the robotic arm to pick them up. In Stage 1 of the competition, our team completed this theme in a simulator (Gazebo).
- Robot Operating System 2 (ROS 2)
- Gazebo
- MoveIt 2
- Computer Vision
- Git
- RViz 2
- Nav 2
Platform: Simulator + Real Industrial Robot (Remote Access)
- Operating System: Ubuntu 22.04 LTS
- Processor: Eight cores, x86_64 (64-bit x86 Instruction Set)
- Storage: HDD or SSD with at least 50GB of space
- RAM: 8GB or more
- Graphics Card: Dedicated graphics card preferred
- Internet: Minimum speed of 5 Mbps & latency <= 50ms
A team of two to four students from the same college, of any year or discipline, can register for the competition.
Based on the members’ filled profile details and theme preferences, e-Yantra will assign the best-suited theme for the team.
After the registration and payment, teams will be inducted into Stage 1 of the competition.
Stage 1 is divided into three tasks: Task 0, Task 1, and Task 2. All tasks are either simulation-based or software-based (MOOC).
Top-performing teams will be selected for Stage 2 based on their Stage 1 performance.
| # | Task | Start Date | End Date | Duration |
|---|---|---|---|---|
| 1 | Task 0 | 05 Sep 2023 | 18 Sep 2023 | 2 Weeks |
| 2 | Coding Contest | 05 Sep 2023 | 01 Oct 2023 | ~4 Weeks |
| 3 | Task 1A | 19 Sep 2023 | 16 Oct 2023 | ~4 Weeks |
| 4 | Task 1B | 19 Sep 2023 | 16 Oct 2023 | ~4 Weeks |
| 5 | Task 1C | 19 Sep 2023 | 16 Oct 2023 | ~4 Weeks |
| 6 | Task 2A | 17 Oct 2023 | 10 Nov 2023 | ~4 Weeks |
| 7 | Task 2B | 17 Oct 2023 | 10 Nov 2023 | ~4 Weeks |
This task tested basic proficiency in the Python language through a team contest hosted on CodeChef. We solved a set of 10 problems; a total of 1475 teams registered for this coding contest:
| # | Problem Description | Code | Points | Success Rate |
|---|---|---|---|---|
| 1 | Palindrome | PAL_PY | 1380 | 58.78% |
| 2 | IFs and FORs with Python | IFFOR1_PY | 1352 | 60.53% |
| 3 | Arithmetic Progression with Lambda function | APLAM1_PY | 1345 | 68.28% |
| 4 | Stars in our pattern | STAR_PY | 1340 | 65.76% |
| 5 | Count the Characters in WORDS | WLEN_PY | 1337 | 63.92% |
| 6 | Decimal to Binary Converter | D2BIN1_PY | 1329 | 65.6% |
| 7 | Calculate the distance between 2 points | DIST1_PY | 1328 | 43.3% |
| 8 | Find the Toppers | SCOR_PY | 1312 | 43.77% |
| 9 | Manage the Inventory | INV_PY | 1279 | 35.42% |
| 10 | Slice the list | SLICE1_PY | 1220 | 32.89% |
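For a flavour of the contest problems, here is a minimal solution sketch for the first problem (Palindrome); the exact CodeChef I/O format is an assumption and not reproduced from the contest:

```python
# Illustrative sketch for the Palindrome problem: report whether a string
# reads the same forwards and backwards. One word per test case is assumed.
def is_palindrome(s: str) -> bool:
    return s == s[::-1]

if __name__ == "__main__":
    word = input().strip()
    print("YES" if is_palindrome(word) else "NO")
```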
The aim of this task is to complete installation, learn about ROS 2, and set up the robot environment for this theme.
This task is divided into four parts:
- Install Ubuntu 22.04 (Jammy Jellyfish) and ROS 2 (Humble).
- Explore the following learning resources to get hands-on experience with ROS 2 and Python.
- Install and build the required packages for this theme.
- Start exploring the robot environment in Gazebo.
In this task, your mission is not to construct a robot from the ground up. Instead, you will be provided with a mobile robot model. Your goal is to perform the robot unboxing ceremony and prepare it for subsequent tasks. Proceed to the task instructions to set up your workspace and ready your robot for its grand debut in the interstellar realm.
- Problem Statement: The task's objective is to configure the simulation of the warehouse world for the Cosmo Logistic theme on your system. Ensure that the world is properly set up with the required packages.
- Procedure:
Build Packages: Begin by creating a workspace; refer to the link for instructions on how to do so. Once completed, compile and source the packages.
Clone the Cosmo Logistic (CL) Repository: Navigate to your colcon_ws directory and clone the Cosmo Logistic theme repository (Ensure the src folder is empty):
cd ~/colcon_ws
git clone https://github.com/eYantra-Robotics-Competition/eYRC-2023_Cosmo_Logistic ./src/
git checkout tags/v1.0.1 .

Note: After Task 0, to return to the remaining tasks, use git checkout main. For users without git installed, use the following command for installation:

sudo apt install git

The Cosmo Logistic package may take some time to clone, depending on your internet speed.
Install Additional Packages: Before building the workspace and applying gazebo changes, use the following commands:
cd ~/colcon_ws/src/ # assuming your workspace is named as colcon_ws
. requirements.sh

To source the installed Gazebo file:
echo "source /usr/share/gazebo-11/setup.bash" >> ~/.bashrc
source ~/.bashrc # source bashrc as we have made changes

Navigate to the colcon_ws directory and build the colcon workspace using the colcon build command.
Note: To build the package in the system, ensure that the terminal is pointing at the ~/colcon_ws directory and not in ~/colcon_ws/src.
cd ~/colcon_ws
colcon build

The setup is now complete!
After the package has been successfully built, do not forget to source it:
source install/setup.bash

For every new package cloned or created within the src of your colcon workspace, build and source the workspace to proceed.
To avoid sourcing the setup file every time, add this to your bashrc using the following command:
echo "source ~/colcon_ws/install/setup.bash" >> ~/.bashrc
source ~/.bashrc # source bashrc as we have made changes

To run the Task 0 launch file, enter:

ros2 launch ebot_description ebot_gazebo_launch.py

This should open the Gazebo application with the mobile robot (named ebot) spawned inside a warehouse.

This repository contains three packages (as of now):
- aws-robomaker-small-warehouse-world: contains warehouse rack and package models
- ebot_description: contains the mobile robot (ebot) description model
- eyantra_warehouse: contains the warehouse world model
# Task 1: Repository Update and Setup
Before proceeding with Task 1, make sure to update your repository. If you have already cloned the eYRC - Cosmo Logistics Repo during Task 0, follow these steps. If not, you can directly clone the repo using:
```bash
git clone https://github.com/eYantra-Robotics-Competition/eYRC-2023_Cosmo_Logistic.git
```

For those who have made changes and want to preserve them:

- Navigate inside the `src` folder of your workspace.
- Stash your changes using the following command:

      git stash

- Pull the updated Task 1 commits from the repo:

      git pull
      git checkout tags/v1.1.1 . # if this doesn't work, try `git fetch` and run it again

- Pop your changes from the stash:

      git stash pop

Note:
- For those who haven't changed any files, you can directly run `git pull` and continue directly with Task 1A.
Next, install required ROS packages for Task 1A:
sudo apt-get install ros-humble-joint-state-broadcaster ros-humble-joint-trajectory-controller ros-humble-controller-manager

Do a colcon build to build the workspace with the updated packages.
Note: After any changes in any files, to reflect the same on the workspace, you need to do colcon build.
If you want to skip rebuilding every time a file (Python script) is changed (note: this doesn't cover the addition of files or folders), you may use the colcon build --symlink-install parameter with the build command. (Search online to learn more about it.)
This task is divided into three parts:
- Locate Aruco markers in the world with respect to the robotic arm.
- Manipulate the robotic arm using the MoveIt framework.
- Navigate the eBot in the Gazebo environment using Nav2.
Welcome to the learning resources for Task 1! In this section, we'll guide you through the essentials of computer vision using the OpenCV library.
- Official OpenCV Tutorials
- ROS cv_bridge Tutorial
- Aruco Marker Detection Tutorial
To get started, make sure to install the necessary libraries:
- Install pip3:

      sudo apt install python3-pip

- Install OpenCV Python and NumPy:

      pip3 install opencv-contrib-python==4.7.0.72
      pip3 install numpy==1.21.5
An Aruco marker, when viewed in an image, is a quadrilateral shape defined by some set of pixels. The properties of the shape, such as corners, area, length of sides, diagonals, medians, and bisectors, can be utilized for various applications.
The idea is to leverage OpenCV to discover properties of the Aruco marker's position and orientation. Apply your logic to determine the pose of the Aruco marker, enabling the publication of transforms between the Aruco object and the base_link of the robot arm.
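As a concrete starting point, here is a minimal detection-and-pose sketch using the pinned OpenCV 4.7 API. The dictionary (DICT_4X4_50), marker size, and camera intrinsics below are assumptions; replace them with the values used in the theme:

```python
# Minimal Aruco detection + pose estimation sketch (OpenCV 4.7 API).
# Assumptions: DICT_4X4_50 dictionary, 15 cm markers, placeholder intrinsics.
import cv2
import numpy as np

MARKER_SIZE = 0.15  # marker side length in metres (assumed)
camera_matrix = np.array([[931.18, 0.0, 640.0],
                          [0.0, 931.18, 360.0],
                          [0.0, 0.0, 1.0]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)                    # assuming no lens distortion

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

image = cv2.imread("frame.png")  # stand-in for a frame from the robot camera
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
corners, ids, _ = detector.detectMarkers(gray)

if ids is not None:
    # One rvec/tvec pair per marker, expressed in the camera frame
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE, camera_matrix, dist_coeffs)
    cv2.aruco.drawDetectedMarkers(image, corners, ids)
    for rvec, tvec, marker_id in zip(rvecs, tvecs, ids.flatten()):
        cv2.drawFrameAxes(image, camera_matrix, dist_coeffs, rvec, tvec, 0.1)
        print(f"marker {marker_id}: position {tvec.ravel()} (camera frame)")
```

From the rvec/tvec pair in the camera frame, a tf2 lookup from the camera link to base_link yields the transform the task asks you to publish.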
Recommended time for completing the setup: 20-24 hrs
In this section, we will learn how to use ROS with the MoveIt 2 package to control robotic manipulators in the Gazebo simulator. The skills you acquire here can be applied to control real robotic manipulators; the code you write translates into actions for actual robots.
Note: Links for each topic are provided as hyperlinks for detailed reference.
- MoveIt Package: Install all MoveIt packages, which are binary-built and ready for direct deployment:

      sudo apt install ros-humble-moveit

- Joint State Broadcaster Package: This package helps publish joint states to a topic, allowing other dependent nodes to know the state of each joint.

      sudo apt install ros-humble-joint-state-publisher

- Joint Trajectory Controller Package: A controller that sends trajectory messages to the robotic arm, specifying joint angle, velocity, and effort values.

      sudo apt install ros-humble-joint-trajectory-controller

- MoveIt Servo Package: This package helps servo the arm, either with linear-velocity commands for a link or by actuating all joint angles.

      sudo apt install ros-humble-moveit-servo

- Trimesh Package: A Python package for importing mesh files into RViz, enabling the arm to account for environmental collisions in its planning scene.

      pip install trimesh

- ROS 2 Control CLI Package: The ROS 2 control CLI package allows access to each controller in the terminal, similar to accessing ROS 2 topics.

      sudo apt install ros-humble-ros2controlcli
The MoveIt Setup Assistant is a graphical user interface for configuring any robot for use with MoveIt. Its primary function is generating a Semantic Robot Description Format (SRDF) file for your robot, which specifies additional information required by MoveIt such as planning groups, end effectors, and various kinematic parameters. Additionally, it generates other necessary configuration files for use with the MoveIt pipeline. To use the MoveIt Setup Assistant, you will need to have a URDF file for your robot.
You may skip any steps not shown below. We highly encourage you to explore the internet for more detailed concepts.
- Start: (make sure to source the workspace in the terminal before starting these steps)

      ros2 launch moveit_setup_assistant setup_assistant.launch.py

  This command will open a window with the GUI of the Setup Assistant.
- Select the URDF file of the robotic arm:
  - Click on "Create New MoveIt Configuration Package."
  - Select the URDF file of the UR5 arm named `ur5_arm.urdf.xacro` from the `ur_description/urdf` package. Note: make sure to select only `ur5_arm.urdf.xacro` and NOT any other xacro file.
  - Click on the "Load Files" button to load the URDF file.
- Generate Self-Collision Matrix: (make sure to keep the sampling density bar at its maximum value)
  - Click on the "Self-Collisions" pane selector on the left-hand side.
  - Click on the "Generate Collision Matrix" button. The Setup Assistant will work for a few seconds before presenting you the results of its computation in the main table.
- Add Virtual Joints: Virtual joints are used primarily to attach the robot to the world. Add two virtual joints:
  - FixedBase: for `base_link`, with parent link `world`
  - CamBase: for `camera_link`, with parent link `world`
- Add Planning Groups: Planning groups are used to semantically describe different parts of your robot.
  - Click on the "Add Group" button and provide details matching the hardware:
    - Group Name: `ur_manipulator`
    - Kinematics Solver: `KDLKinematicsPlugin`
  - Click on "Add Joints" and add the specified joints.
  - Add links as shown.
  - Add a chain by selecting `base_link` as the base link and `tool0` as the tip link. The final planning group setup should look like the provided window.
- Label End Effectors:
  - Add an end effector with the following info:
    - EEF Name: `gripper`
    - Group Name: `ur_manipulator`
    - Parent Link: `tool0`
- Checking ros2_control:
  - Make sure that ros2_control has `position` for the `command_interface`, and `position` and `velocity` for the `state_interface`.
- Add Author Information:
  - Click on the "Author Information" pane.
  - Enter your name and email address.
- Generate Configuration Files:
  - Click on the "Configuration Files" pane.
  - Choose a location and name for the ROS package that will be generated containing your new set of configuration files.
  - Click on "Generate Package."
  - The Setup Assistant will generate and write a set of launch and config files into the directory of your choosing.
All the generated files will appear in the "Generated Files/Folders" tab, and you can click on each of them for a description of what they contain. Finally, you can exit the setup assistant.
In the previous module, you completed the MoveIt! setup assistant and successfully generated MoveIt! configuration files. In this module, you will be using those configuration files for the simulation of the robotic arm.
MoveIt! comes with a plugin for the ROS Visualizer (RViz). The plugin allows you to set up scenes in which the robot will work, generate plans, visualize the output, and interact directly with a visualized robot. We will explore the plugin in this tutorial.
Let's start by visualizing the planning of a robotic arm in RViz. Follow these steps:
- Launch the demo launch file from the MoveIt configuration package that you created:

      ros2 launch ur5_moveit demo.launch.py

  This command will launch the robotic arm in RViz.

- In RViz, you will see the robotic arm and a MotionPlanning display type.
This setup allows you to interact with the simulated robotic arm, visualize planning scenarios, and generate plans using MoveIt! in the RViz interface.
Feel free to explore different functionalities provided by the RViz plugin to enhance your understanding of the robotic arm's behavior in a simulated environment.
Note: Package & file names, and content of config files might be slightly different.
Now that we have seen how to use the Setup Assistant and visualize the movement of the arm in RViz, let's find out how to simulate it in Gazebo. To make the arm move in Gazebo, we need an interface that takes commands from MoveIt and conveys them to the arm in simulation. This interface, in this scenario, is a controller: a generic feedback control loop, typically a PID controller, that regulates the output sent to your actuators.
- ROS Controllers Configuration

  A configuration file called `ros_controllers.yaml` has to be created inside the `config` folder of the `ur5_moveit` package. Remove the previous contents (if any) and paste the configuration given below:

      # ros_controllers.yaml
      controller_manager:
        ros__parameters:
          update_rate: 2000
          joint_trajectory_controller:
            type: joint_trajectory_controller/JointTrajectoryController
          joint_state_broadcaster:
            type: joint_state_broadcaster/JointStateBroadcaster

      joint_trajectory_controller:
        ros__parameters:
          command_interfaces:
            - position
          state_interfaces:
            - position
            - velocity
          joints:
            - elbow_joint
            - shoulder_lift_joint
            - shoulder_pan_joint
            - wrist_1_joint
            - wrist_2_joint
            - wrist_3_joint
          state_publish_rate: 100.0
          action_monitor_rate: 20.0
          allow_partial_joints_goal: false
          constraints:
            stopped_velocity_tolerance: 0.0
            goal_time: 0.0

      joint_state_broadcaster:
        ros__parameters:
          type: joint_state_broadcaster/JointStateBroadcaster
- MoveIt Controllers Configuration

  A configuration file called `moveit_controllers.yaml` has to be created inside the `config` folder of the `ur5_moveit` package. Remove the previous contents (if any) and paste the configuration given below:

      # moveit_controllers.yaml
      controller_names:
        - joint_trajectory_controller

      joint_trajectory_controller:
        action_ns: follow_joint_trajectory
        type: FollowJointTrajectory
        default: true
        joints:
          - shoulder_pan_joint
          - shoulder_lift_joint
          - elbow_joint
          - wrist_1_joint
          - wrist_2_joint
          - wrist_3_joint
- OMPL Planning Configuration

  A configuration file called `ompl_planning.yaml` has to be created inside the `config` folder of the `ur5_moveit` package. Remove the previous contents (if any) and paste the configuration given below:

      # ompl_planning.yaml
      planning_plugin: 'ompl_interface/OMPLPlanner'
      request_adapters: >-
        default_planner_request_adapters/AddTimeOptimalParameterization
        default_planner_request_adapters/FixWorkspaceBounds
        default_planner_request_adapters/FixStartStateBounds
        default_planner_request_adapters/FixStartStateCollision
        default_planner_request_adapters/FixStartStatePathConstraints
      start_state_max_bounds_error: 0.1
- Servo Information Configuration

  A configuration file called `ur_servo.yaml` has to be created inside the `config` folder of the `ur5_moveit` package. Remove the previous contents (if any) and paste the configuration given below. (A rough sketch of streaming velocity commands to the servo node appears at the end of this section.)

      # ur_servo.yaml
      ###############################################
      # Modify all parameters related to servoing here
      ###############################################
      use_gazebo: true # Whether the robot is started in a Gazebo simulation environment

      ## Properties of incoming commands
      command_in_type: "speed_units" # "unitless"> in the range [-1:1], as if from joystick. "speed_units"> cmds are in m/s and rad/s
      scale:
        # Scale parameters are only used if command_in_type=="unitless"
        linear: 0.6 # Max linear velocity. Meters per publish_period. Unit is [m/s]. Only used for Cartesian commands.
        rotational: 0.3 # Max angular velocity. Rads per publish_period. Unit is [rad/s]. Only used for Cartesian commands.
        # Max joint angular/linear velocity. Rads or Meters per publish period. Only used for joint commands on joint_command_in_topic.
        joint: 0.01
      # This is a fudge factor to account for any latency in the system, e.g. network latency or poor low-level
      # controller performance. It essentially increases the timestep when calculating the target pose, to move the target
      # pose farther away. [seconds]
- Launch Gazebo and UR5 Gazebo Spawner: First, launch Gazebo along with the UR5 Gazebo spawner. Open a terminal and run the following command:

      ros2 launch ur_description ur5_gazebo_launch.py

  This will start Gazebo and load the UR5 robot.
- Launch MoveIt and UR5 MoveIt Spawner: After launching Gazebo, open a new terminal and run the command:

      ros2 launch ur5_moveit spawn_ur5_launch_moveit.launch.py

  This will launch MoveIt along with the UR5 MoveIt spawner.
- Visualize in RViz: After launching both Gazebo and MoveIt, you can visualize the UR5 in RViz. Open a new terminal and run:

      ros2 launch ur5_moveit demo.launch.py

  This command will open RViz, where you should see the UR5 robot model. You can interact with the robot in RViz to plan and execute motions.
- Add Visualization Tools: As mentioned in the note, you need to add other visualization tools to RViz. Here are the steps:
  - Click on the "Panels" tab in RViz.
  - Select "MoveIt" from the drop-down menu.
  - A MoveIt MotionPlanning plugin panel should appear. You can use this panel to plan and execute motions for the UR5.

With these steps, you should be able to control the UR5 robotic arm using RViz and MoveIt. Make sure to follow the instructions carefully.
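To exercise the servo configuration from `ur_servo.yaml` above, the servo node can be fed a steady stream of `TwistStamped` messages. This is a rough sketch under assumptions: the default `/servo_node/delta_twist_cmds` topic and the `base_link` frame may differ in the theme's launch files, so verify with `ros2 topic list`:

```python
# Sketch of streaming Cartesian velocity commands to MoveIt Servo.
# Topic name and frame are assumptions; check your servo launch file.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TwistStamped

class ServoCommander(Node):
    def __init__(self):
        super().__init__('servo_commander')
        self.pub = self.create_publisher(
            TwistStamped, '/servo_node/delta_twist_cmds', 10)
        self.timer = self.create_timer(0.02, self.send)  # 50 Hz command stream

    def send(self):
        msg = TwistStamped()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'base_link'
        msg.twist.linear.x = 0.1  # drift the end effector along +X at 0.1 m/s
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(ServoCommander())

if __name__ == '__main__':
    main()
```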
1. Nav2:
Install the Nav2 packages using your operating system’s package manager:

    sudo apt install ros-humble-navigation2
    sudo apt install ros-humble-nav2-bringup

2. SLAM Toolbox:
Install the SLAM Toolbox, which will be used to build the map and can also be used for localization:

    sudo apt install ros-humble-slam-toolbox

3. Robot Localization:
Install robot-localization, a collection of state estimation nodes, each of which is an implementation of a nonlinear state estimator for robots moving in 2D or 3D space:

    sudo apt install ros-humble-robot-localization

4. Others:
Additional tools for your robotic setup:

    sudo apt install ros-humble-joint-state-publisher-gui
    sudo apt install ros-humble-xacro

- Clone the Repository: First, pull the latest eYRC-2023_Cosmo_Logistic repository into your workspace. Check for the package `ebot_nav2` and verify the file structure.
- Check SLAM-Toolbox Installation: Ensure that SLAM-Toolbox is installed:

      ros2 pkg list | grep slam_toolbox

  If not installed, run the following command:

      sudo apt install ros-humble-slam-toolbox
- Set SLAM-Toolbox Parameters: Set the parameters for SLAM-Toolbox in the file `mapper_params_online_async.yaml`, located in the directory `ebot_nav2/config/`:

      # ROS Parameters
      odom_frame: odom
      map_frame: map
      base_frame: base_footprint
      scan_topic: /scan
      mode: mapping
- Load Parameters and Add SLAM-Toolbox Node: Load the `mapper_params_online_async.yaml` file and add the SLAM-Toolbox node in the `ebot_bringup_launch.py` launch file, located in the directory `ebot_nav2/launch/`. Ensure that it's already added in the launch file.

      # Loading the params
      declare_mapper_online_async_param_cmd = DeclareLaunchArgument(
          'async_param',
          default_value=os.path.join(ebot_nav2_dir, 'config', 'mapper_params_online_async.yaml'),
          description='Set mappers online async param file')

      # Adding SLAM-Toolbox with online_async_launch.py
      mapper_online_async_param_launch = IncludeLaunchDescription(
          PythonLaunchDescriptionSource(
              os.path.join(get_package_share_directory('slam_toolbox'), 'launch', 'online_async_launch.py'),
          ),
          launch_arguments=[('slam_params_file', LaunchConfiguration('async_param'))],
      )
- Add Launch Configuration Object: Add the following lines as objects of `LaunchDescription()` in the same launch file, at the end:

      ld.add_action(declare_mapper_online_async_param_cmd)
      ld.add_action(mapper_online_async_param_launch)

  Save the file.
- Launch eBot in Gazebo, Teleop, and Navigation: Launch the required components (each in a separate terminal):

      ros2 launch ebot_description ebot_gazebo_launch.py
      ros2 run teleop_twist_keyboard teleop_twist_keyboard
      ros2 launch ebot_nav2 ebot_bringup_launch.py

  You will see output indicating successful mapping. Move the eBot using `teleop_twist_keyboard` around the warehouse to generate the complete map.
- Build a Complete Map: Ensure that the generated map has no empty cells in the arena or in the middle. Congratulations on creating your first map!
- Save the Map in RViz:
  - Open RViz and navigate to Panels.
  - Under "Add New Panels," select `SlamToolboxPlugin` under `slam_toolbox`.
- Save the Map:
  - In the `SlamToolboxPlugin` panel, enter the map name in the first two rows.
  - Click on both "Save Map" and "Serialize Map." This action will save the map in the currently opened directory, i.e., `~/colcon_ws/`.
- Copy Map Files to the `ebot_nav2` Package:
  - Locate the saved map files (`map.data`, `map.pgm`, `map.posegraph`, and `map.yaml`) in `~/colcon_ws/`.
  - Copy these files to the `maps/` directory of the `ebot_nav2` package, i.e., `ebot_nav2/maps/`.
- Update Map Name in Launch Files:
  - Update the map name in the `ebot_bringup_launch.py` launch file located in `ebot_nav2/launch/`:

        declare_map_yaml_cmd = DeclareLaunchArgument(
            'map',
            default_value=os.path.join(ebot_nav2_dir, 'maps', 'map_name.yaml'),  ## Update the map config here
            description='Full path to map yaml file to load')
- Update Map Name in Parameters:
  - Update the map name in the `nav2_params.yaml` parameters file located in `ebot_nav2/params/`:

        map_server:
          ros__parameters:
            use_sim_time: True
            yaml_filename: "map_name.yaml"  ## Update the map name here
- Rebuild the `ebot_nav2` Package:
  - Since you added map files to the `ebot_nav2` package, you need to rebuild it:

        colcon build

    OR

        colcon build --symlink-install
The goal of this task is to perform pose estimation of package boxes using Aruco marker detection. The task involves locating package boxes in a warehouse using computer vision, specifically Aruco detection. Each package box is equipped with a unique Aruco marker. The objective is to selectively identify boxes within the robot arm's reach. Follow the instructions to set up your workspace and prepare the robotic arm for its first movement.
Set up your ROS workspace and ensure all necessary dependencies are installed.
You will be provided with a UR5 robotic arm model situated in a warehouse. The warehouse contains racks and package boxes. The task involves detecting the Aruco markers on these boxes.
- Utilize the OpenCV library to detect Aruco markers on the package boxes.
- Implement logic to differentiate between different markers and uniquely identify each package box.
- Once Aruco markers are detected, employ pose estimation techniques to determine the position and orientation of each package box.
- The goal is to calculate the transformation between the Aruco marker's center position on the package box and the `base_link` of the robot arm.
- Publish the calculated transform between Aruco marker centers and the `base_link` of the robot arm.
- Visualize the detected package boxes and their poses in RViz for verification.
- Apply logical and mathematical reasoning to ensure accurate pose estimation.
- Handle different orientations, distances, and configurations of the Aruco markers.
- Clearly document your code, explaining the logic behind marker detection and pose estimation.
- Provide details on the mathematical concepts used for pose estimation.
- Test the pose estimation on different scenarios within the warehouse.
- Validate the accuracy of the detected poses by comparing them with ground truth if available.
- Submit your well-documented code along with any supporting files.
- Include instructions on how to run and test your code.
- Implement a method to dynamically update the pose estimation as the robot arm moves or as the environment changes.
- Explore advanced computer vision techniques for more robust detection and pose estimation.
Objective:
The objective of this task is to develop a Python script for detecting Aruco markers on package boxes using RGB and Depth image data from the robot's camera. The script should publish the transform between the Aruco marker's center position and the base_link frame of the robot arm. Additionally, the detected Aruco tags and their center points should be displayed on the RGB image using the OpenCV library.
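For the transform-publishing part, a minimal tf2 broadcaster sketch is shown below. It assumes the marker pose has already been transformed into the `base_link` frame; the node and method names are illustrative, not from the boilerplate:

```python
# Sketch of broadcasting obj_<marker_id> relative to base_link with tf2.
# Assumes the pose is already expressed in the base_link frame.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TransformStamped
from tf2_ros import TransformBroadcaster

class ArucoTfPublisher(Node):
    def __init__(self):
        super().__init__('aruco_tf_publisher')
        self.broadcaster = TransformBroadcaster(self)

    def publish_marker_tf(self, marker_id, xyz, quat):
        # obj_<marker_id> is the child-frame naming the task expects
        t = TransformStamped()
        t.header.stamp = self.get_clock().now().to_msg()
        t.header.frame_id = 'base_link'
        t.child_frame_id = f'obj_{marker_id}'
        t.transform.translation.x, t.transform.translation.y, t.transform.translation.z = xyz
        (t.transform.rotation.x, t.transform.rotation.y,
         t.transform.rotation.z, t.transform.rotation.w) = quat
        self.broadcaster.sendTransform(t)

def main():
    rclpy.init()
    node = ArucoTfPublisher()
    # Example call with placeholder pose values
    node.publish_marker_tf(3, (0.5, 0.2, 0.8), (0.0, 0.0, 0.0, 1.0))
    rclpy.spin(node)

if __name__ == '__main__':
    main()
```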
Instructions:
- Boilerplate Script:
  - Start with the provided boilerplate script named `task1a.py`, located in the `scripts` folder of the `ur_description` package. This script provides guidance on the steps and instructions to complete Task 1A.
- Dependencies Installation:
  - Ensure that the required ROS packages are installed by running the following command in the terminal:

        sudo apt-get install ros-humble-joint-state-broadcaster ros-humble-joint-trajectory-controller ros-humble-controller-manager
- Environment Setup:
  - Launch the robot in Gazebo and open RViz by running the following commands in separate terminals:

        ros2 launch ur_description ur5_gazebo_launch.py
        ros2 launch ur_description spawn_ur5_launch.py

  - Keep these terminals running.
- Run the Python Script:
  - Execute your Task 1A Python script in a new terminal. Replace the package name and filename if you are not using the provided boilerplate script:

        ros2 run ur_description task1a.py
- Visualization in RViz:
  - In RViz:
    - Set the fixed frame to "world."
    - Add a new display or press Ctrl + N.
    - Select "TF" in the "By display type" section.
    - Add "PointCloud2" from the "By display type" section.
  - You will see all TF frames published on the "world" frame. Uncheck "All Enabled" and select only `obj_<marker_id>` (the final box frame you are publishing) in the TF frames list.
- Expected Output:
- Documentation:
  - Ensure that your code is well-documented, explaining the logic behind Aruco marker detection and pose estimation.
- Testing and Validation:
  - Test the script on different scenarios within the warehouse.
  - Validate the accuracy of the detected poses.
- Additional Notes:
  - Use OpenCV for image processing. Other libraries may be chosen if you are confident with them, but they may not receive support from the e-Yantra team.
  - Handle duplicate boxes with the same Aruco IDs, so that boxes out of the robot arm's reach are not processed.
Points to Note:
- Z-Axis Orientation:
  - In the image window, the Z-axis is drawn pointing away from the box. However, for practical use in the robot arm's picking motion (Task 1B), the Z-axis in the TF should be oriented towards the box. Ensure that when sending the transform, the Z-axis is inverted so that it points towards the box (a quaternion-flip sketch follows this list).
- Image Window Annotations:
  - Draw Aruco tag borders on the image window.
  - Mark a small circular dot representing the center position of the tag.
  - Display the Aruco tag ID along with the tag.
- Handling Difficulty:
  - There might be package boxes placed at the back of the front rack with Aruco IDs. This is done intentionally to increase the task difficulty. Ensure that your code:
    - Avoids drawing axes for such package boxes.
    - Does not send a transform with respect to `base_link` for boxes that are out of the robot arm's reach.
- TF Orientation Example:
  - The TF orientation should resemble the example above, with the Z-axis pointing towards the box.
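For the Z-axis inversion noted above, one possible approach (a sketch, not the only method) is to post-multiply the marker quaternion by a 180° rotation about its local X-axis, which maps +Z to -Z:

```python
# Hypothetical helper for flipping a marker frame's Z-axis.
# Quaternion order is (x, y, z, w); uses numpy only.
import numpy as np

def quat_multiply(q1, q2):
    """Hamilton product q1 * q2 for quaternions in (x, y, z, w) order."""
    x1, y1, z1, w1 = q1
    x2, y2, z2, w2 = q2
    return np.array([
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
    ])

# 180 deg about X: (sin 90, 0, 0, cos 90) = (1, 0, 0, 0)
FLIP_X_180 = np.array([1.0, 0.0, 0.0, 0.0])

def invert_z(marker_quat):
    # Rotating 180 deg about the local X-axis sends +Z to -Z
    return quat_multiply(marker_quat, FLIP_X_180)
```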
Implementation Tips:
- Consider using OpenCV functions to draw Aruco tag borders, center points, and annotations on the image.
- Be cautious about duplicate Aruco IDs and handle them appropriately.
- Test your implementation in various scenarios to ensure robustness.
Note: Before attempting the task, make sure you have gone through the learning resources on robotic arm manipulation and the MoveIt framework.
In Task 1B, your objective is to perform motion planning for the UR5 robotic arm using the MoveIt framework. The task involves reaching specific positions or joint angles to prepare the arm for the future task of picking boxes from racks.
- Launch the Environment:
  - First, launch the UR5 robot in Gazebo along with RViz. Use the following commands in separate terminals:

        ros2 launch ur_description ur5_gazebo_launch.py
        ros2 launch ur_description spawn_ur5_launch.py

- Run the MoveIt Configuration Script:
  - Run the MoveIt configuration script to configure the MoveIt framework for the UR5 robot. Use the following command:

        ros2 launch ur5_moveit_config ur5_moveit_planning_execution.launch.py sim:=true

- Launch RViz:
  - Open RViz to visualize the robot model and the motion planning process:

        ros2 launch ur5_moveit_config moveit_rviz.launch.py config:=true

- Run the Python Script:
  - Use a Python script to command the UR5 arm to reach specific positions or joint angles. You can write a script that utilizes the MoveIt Python API or use the MoveGroupCommander for Python. For example:

        ros2 run ur5_moveit_config move_group_python_interface.py

  - Ensure that your Python script contains the necessary logic to plan and execute arm movements.

- Visualize the Motion:
  - In RViz, you should be able to visualize the planned motion of the UR5 arm. Make sure the arm reaches the specified positions or joint angles accurately.

- Documentation:
  - Document the positions or joint angles you are instructing the arm to reach in your README file.
  - Explain any key decisions or considerations in your approach.
Note: Before attempting the task, make sure you have gone through the learning resources on manipulation of the robotic arm.
In Task 1B, your objective is to create a Python script that controls the UR5 robotic arm's movement to pick and drop objects in a specified order. You will move the arm to Pick Position 1 (P1), Drop Position (D), and Pick Position 2 (P2).
- Launch Gazebo and MoveIt:
  - Start Gazebo, waiting for the UR5 to spawn, using the following command:

        ros2 launch ur_description ur5_gazebo_launch.py

  - Use the launch file created with the MoveIt Setup Assistant, i.e., `spawn_ur5_launch_moveit.launch.py`, to spawn the UR5 arm in Gazebo and RViz:

        ros2 launch ur_description spawn_ur5_launch_moveit.launch.py
- Write the Python Script:
  - Create a single Python script that moves the UR5 arm in the specified order: P1 → D → P2 → D.
  - You are free to use the servoing method, let MoveIt plan (actuate directly by giving a pose or joint angles using the move_group action), or a combination of both. (A rough joint-trajectory sketch is provided at the end of this task.)
  - Feel free to add intermediate poses if needed to complete the task.
- Run the Python Script:
  - Start your Python script to execute the arm movements:

        ros2 run ur_description task1b.py

  - Ensure that the arm moves in the correct sequence (P1 → D → P2 → D).

- Documentation:
  - In your README file, document the logic and approach used in your Python script.
  - Explain any key decisions or considerations made during the arm manipulation.
  - Include any challenges faced and how you addressed them.

- Visualization:
  - Visualize the arm movements in RViz to ensure they align with the specified positions.
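As referenced above, one possible way to actuate the arm without full MoveIt planning is to send goals directly to the `joint_trajectory_controller` action server configured earlier. This is a rough sketch, not the official solution; the joint angles for P1 below are placeholders:

```python
# Sketch: command the UR5 via a FollowJointTrajectory action goal.
# Joint names match ros_controllers.yaml; target angles are placeholders.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from builtin_interfaces.msg import Duration
from control_msgs.action import FollowJointTrajectory
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

JOINTS = ['shoulder_pan_joint', 'shoulder_lift_joint', 'elbow_joint',
          'wrist_1_joint', 'wrist_2_joint', 'wrist_3_joint']

class ArmCommander(Node):
    def __init__(self):
        super().__init__('arm_commander')
        self.client = ActionClient(
            self, FollowJointTrajectory,
            '/joint_trajectory_controller/follow_joint_trajectory')

    def move_to(self, joint_angles, seconds=5):
        goal = FollowJointTrajectory.Goal()
        goal.trajectory = JointTrajectory(joint_names=JOINTS)
        point = JointTrajectoryPoint(positions=joint_angles,
                                     time_from_start=Duration(sec=seconds))
        goal.trajectory.points = [point]
        self.client.wait_for_server()
        send_future = self.client.send_goal_async(goal)
        rclpy.spin_until_future_complete(self, send_future)
        result_future = send_future.result().get_result_async()
        rclpy.spin_until_future_complete(self, result_future)  # wait until motion finishes

def main():
    rclpy.init()
    node = ArmCommander()
    p1 = [0.0, -1.57, 1.57, -1.57, -1.57, 0.0]  # placeholder joint angles for P1
    node.move_to(p1)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```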
Task Objective: Cosmo Logistic eBot Navigation with ROS2 Navigation Stack (Nav2). The goal is to set up the ROS2 Navigation Stack (Nav2), learn the tools provided by it, and navigate a robot within a lunar warehouse.
- Install the necessary packages using your operating system’s package manager. Include these in your README file for others to replicate:

      sudo apt install ros-humble-navigation2
      sudo apt install ros-humble-nav2-bringup
      sudo apt install ros-humble-slam-toolbox
      sudo apt install ros-humble-robot-localization
      sudo apt install ros-humble-joint-state-publisher-gui
      sudo apt install ros-humble-xacro
- Pull the latest eYRC-2023_Cosmo_Logistic repository into your workspace.
- Check for the `ebot_nav2` package in the repository.
- Set the parameters for slam-toolbox in the `mapper_params_online_async.yaml` file in the `ebot_nav2/config/` directory:

      # ROS Parameters
      odom_frame: odom
      map_frame: map
      base_frame: base_footprint
      scan_topic: /scan
      mode: mapping

- Load `mapper_params_online_async.yaml` and add the slam_toolbox node in the `ebot_bringup_launch.py` launch file in the `ebot_nav2/launch/` directory:

      # Loading the params
      declare_mapper_online_async_param_cmd = DeclareLaunchArgument(
          'async_param',
          default_value=os.path.join(ebot_nav2_dir, 'config', 'mapper_params_online_async.yaml'),
          description='Set mappers online async param file')

      # Adding slam-toolbox, with online_async_launch.py
      mapper_online_async_param_launch = IncludeLaunchDescription(
          PythonLaunchDescriptionSource(
              os.path.join(get_package_share_directory('slam_toolbox'), 'launch', 'online_async_launch.py'),
          ),
          launch_arguments=[('slam_params_file', LaunchConfiguration('async_param'))],
      )

      ld.add_action(declare_mapper_online_async_param_cmd)
      ld.add_action(mapper_online_async_param_launch)

- Launch Gazebo, teleop, and navigation using the following commands (in separate terminals):

      ros2 launch ebot_description ebot_gazebo_launch.py
      ros2 run teleop_twist_keyboard teleop_twist_keyboard
      ros2 launch ebot_nav2 ebot_bringup_launch.py
- Ensure that the robot moves and maps the environment as it navigates.
- Build the map without any empty cells in the arena.
- Go to RViz, add a new panel, and select `SlamToolboxPlugin` under `slam_toolbox`.
- Enter the map name in the first two rows and click on both "Save Map" and "Serialize Map."
- Copy the map files to the `ebot_nav2/maps/` directory.
- Update the map name in the `ebot_bringup_launch.py` launch file and the `nav2_params.yaml` parameters file.
- Rebuild the `ebot_nav2` package:

      colcon build
Note: Before attempting the task, make sure you have gone through the Learning resources - Autonomous Navigation.
Important Note: Pull the latest eYRC-2023_Cosmo_Logistic repo and run colcon build for this task.
For this task, you need to write a single Python script, `ebot_nav_cmd.py`, to navigate the eBot through specified poses. The poses include both position and orientation ([x, y, yaw]) in the following order:
- P1: [1.8, 1.5, 1.57]
- P2: [2.0, -7.0, -1.57]
- P3: [-3.0, 2.5, 1.57]
The allowed tolerance is:
- Mapping:
  - Map the warehouse using `slam_toolbox`. Follow the resources provided in the Mapping subsection of ROS2 Navigation in the Learning Resources section.
  - Create and save the map with a name like `map_name.pgm`.
- Navigation Script (`ebot_nav_cmd.py`):
  - Write a Python script to navigate the eBot through the specified poses.
  - Use the Simple Commander API by Nav2 to create the script.
  - Refer to the Navigation subsection of ROS2 Navigation in the Learning Resources section for guidance.
  - Ensure that the script considers the specified poses and tolerances.
Hint:
- You can refer to the Simple Commander API by Nav2 to create the script (a rough end-to-end sketch follows below).
- Tune the Nav2 parameters in `nav2_params.yaml` to achieve the desired navigation behavior.
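A rough end-to-end sketch using the Simple Commander API is shown below. The three waypoints are the P1-P3 poses above; the feedback handling is a simplification to adapt:

```python
# Sketch: visit P1, P2, P3 with Nav2's Simple Commander API.
# Assumes the poses are given in the map frame as (x, y, yaw).
import math
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

WAYPOINTS = [(1.8, 1.5, 1.57), (2.0, -7.0, -1.57), (-3.0, 2.5, 1.57)]  # P1, P2, P3

def make_pose(navigator, x, y, yaw):
    pose = PoseStamped()
    pose.header.frame_id = 'map'
    pose.header.stamp = navigator.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    # Yaw-only rotation expressed as a quaternion
    pose.pose.orientation.z = math.sin(yaw / 2.0)
    pose.pose.orientation.w = math.cos(yaw / 2.0)
    return pose

def main():
    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()  # blocks until Nav2 is up
    for x, y, yaw in WAYPOINTS:
        navigator.goToPose(make_pose(navigator, x, y, yaw))
        while not navigator.isTaskComplete():
            pass  # could inspect navigator.getFeedback() here
        if navigator.getResult() != TaskResult.SUCCEEDED:
            print(f'Failed to reach ({x}, {y})')
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```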
Note: Before starting Task 2, make sure to update your repository. Navigate inside the src folder of your workspace and execute the following commands:
git stash
git pull
git stash pop

If you encounter any issues with plugins, try the following:

pip3 install cryptocode
sudo cp libgazebo_link_ur5_attacher.so /usr/lib/x86_64-linux-gnu/gazebo-11/plugins/
sudo cp libgazebo_link_attacher.so /usr/lib/x86_64-linux-gnu/gazebo-11/plugins/

Colcon Build:
After making changes in any files, rebuild the workspace using:

colcon build

If you want to skip the step of building every time a file is changed, you may use the `colcon build --symlink-install` parameter.
Your goal in this part is to locate Aruco markers in the world with respect to the arm and pick the boxes.
In this part, you will work on moving the rack around using eBot.
Instructions:
1. Write a Python script to detect Aruco markers using image data from the camera and publish TF (transforms).
2. Subscribe to the camera topics to get RGB and Depth images for Aruco detection; the camera plugins publish image data in the ROS Image format (a minimal subscription sketch follows this list).
3. Detect the Aruco tags present on each package box within the camera/robot arm's vicinity.
4. Find the center pose (position and orientation) of these package boxes and send transforms between the box/object frame (center position of the Aruco tag) and the base_link of the robot arm.
5. Implement a motion planning algorithm to pick the boxes. Ensure that the motion planning takes into account the Aruco marker information and the current state of the robot arm.
6. Save the detected Aruco tags with their IDs and positions in a file for future reference.
7. Implement a strategy to handle variability in the box positions, considering that the boxes will spawn at random positions each time you launch.
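As referenced in step 2, a minimal camera-subscription sketch with cv_bridge might look like the following. The topic name `/camera/color/image_raw` is an assumption; check `ros2 topic list` for the theme's actual topics:

```python
# Sketch: subscribe to the RGB camera topic and convert ROS Image messages
# to OpenCV frames with cv_bridge. Topic name is an assumption.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class CameraListener(Node):
    def __init__(self):
        super().__init__('camera_listener')
        self.bridge = CvBridge()
        self.sub = self.create_subscription(
            Image, '/camera/color/image_raw', self.on_image, 10)

    def on_image(self, msg):
        # bgr8 matches what OpenCV expects for color processing
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        self.get_logger().info(f'got frame {frame.shape}')

def main():
    rclpy.init()
    rclpy.spin(CameraListener())

if __name__ == '__main__':
    main()
```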
Task 2B - Docking
Objective:
Implement a docking mechanism for the eBot using ultrasonic sensors and IMU data to align its orientation and position with a rack.
Instructions:
1. Utilize the provided ultrasonic sensor data from the topics /ultrasonic_rl/scan (left) and /ultrasonic_rr/scan (right). Extract the distance readings to the rear of the eBot.
2. Use the IMU data from the topic /imu to get the orientation of the eBot.
3. Implement a docking service using the custom service message provided under ebot_docking/srv/DockSw. The service should take parameters such as linear_dock, orientation_dock, distance, orientation, and rack_no.
4. The docking service should align the orientation of the eBot with the rack based on the IMU data. You can calculate the angular difference and use a P-controller to achieve the alignment.
5. If linear_dock is set to true, use the ultrasonic sensors' distance readings to perform linear correction and move the eBot closer to the rack.
6. Ensure that the docking mechanism considers the specific orientation and position of the rack.
7. Implement both the service server and client in a Python script, following the provided boilerplate at ebot_docking/scripts/ebot_docking_boilerplate.py. Fill in the logic for the P-controller and the docking mechanism.
Service Message:

    bool linear_dock          # Linear correction
    bool orientation_dock     # Angular correction
    float64 distance          # Optional param for distance
    float64 orientation       # Goal orientation
    string rack_no            # Rack number
    ---
    bool success              # Indicates successful run of triggered service
    string message            # Informational, e.g., for error messages

Note:
- Research and understand the P-controller for proper orientation alignment (a sketch follows below).
- Make use of the ultrasonic sensors' distance readings to implement linear correction.
- Test the docking mechanism in simulation before deploying it on the physical eBot.
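An illustrative P-controller for the orientation-alignment step is sketched below, assuming `current_yaw` comes from the /imu topic and `target_yaw` from the service request. The gain and clamp values are placeholders to be tuned in simulation:

```python
# Sketch: proportional controller for yaw alignment during docking.
# KP and MAX_W are placeholder tuning values.
import math

KP = 0.8      # proportional gain (tune in simulation)
MAX_W = 0.5   # clamp on angular speed [rad/s]

def normalize_angle(angle):
    """Wrap an angle to [-pi, pi] so the error takes the short way around."""
    return math.atan2(math.sin(angle), math.cos(angle))

def orientation_control(current_yaw, target_yaw):
    error = normalize_angle(target_yaw - current_yaw)
    # Proportional command, clamped to the robot's angular speed limit
    return max(-MAX_W, min(MAX_W, KP * error))
```

The returned value can be published as the angular component of a velocity command until the error falls below a chosen tolerance.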
Implement a mechanism to attach and detach the eBot with a rack using Gazebo plugins. The attachment should be performed when the eBot is in close proximity to the rack, simulating the behavior of an electromagnet.
Instructions:
- Use the provided Gazebo plugin commands to attach and detach the link between the eBot and the rack in the simulation environment.
- The command to attach the link:

      ros2 service call /ATTACH_LINK linkattacher_msgs/srv/AttachLink "{model1_name: 'ebot', link1_name: 'ebot_base_link', model2_name: 'rack1', link2_name: 'link'}"

  Modify `model2_name` with the actual name of the rack model.
- The command to detach the link:

      ros2 service call /DETACH_LINK linkattacher_msgs/srv/DetachLink "{model1_name: 'ebot', link1_name: 'ebot_base_link', model2_name: 'rack1', link2_name: 'link'}"

  Modify `model2_name` with the actual name of the rack model.
- Create a Python client script to call these services for attaching and detaching the link. You can refer to the provided sample snippet; a rough client sketch also appears after the notes below.
Note:
- Ensure that the eBot is in close proximity to the rack (within 0.3 m) before attempting to attach the link.
- Test the attachment and detachment mechanism in simulation to ensure its correctness.
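As referenced above, a rough sketch of such a client, using the `linkattacher_msgs` interface from the attach command (the model and link names mirror that command; adapt them to your rack):

```python
# Sketch: service client for /ATTACH_LINK, mirroring the CLI call above.
import rclpy
from rclpy.node import Node
from linkattacher_msgs.srv import AttachLink

class RackAttacher(Node):
    def __init__(self):
        super().__init__('rack_attacher')
        self.client = self.create_client(AttachLink, '/ATTACH_LINK')
        while not self.client.wait_for_service(timeout_sec=1.0):
            self.get_logger().info('waiting for /ATTACH_LINK service...')

    def attach(self, rack_name):
        req = AttachLink.Request()
        req.model1_name = 'ebot'
        req.link1_name = 'ebot_base_link'
        req.model2_name = rack_name   # e.g. 'rack1'
        req.link2_name = 'link'
        future = self.client.call_async(req)
        rclpy.spin_until_future_complete(self, future)
        return future.result()

def main():
    rclpy.init()
    node = RackAttacher()
    result = node.attach('rack1')
    node.get_logger().info(f'attach result: {result}')
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

A detach client is symmetric, using `DetachLink` and the `/DETACH_LINK` service.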
| Team ID | Name | Branch | Email ID |
|---|---|---|---|
| CL#2277 | Soumo Roy | ECE | [email protected] |
| CL#2277 | Joel Viju | ECE | [email protected] |
| CL#2277 | Anirudh Singareddy | CSE | [email protected] |
| CL#2277 | Rohan Raj | ECE | [email protected] |









