
2. Creating a robot using ROS and Gazebo

Hamsadatta edited this page Mar 8, 2020 · 1 revision

Recap

In the previous article of the series, we introduced the Robot Operating System (ROS). We discussed the role of ROS as a platform in robotics, the different robot types it supports, and its basic architecture; we then worked through a simple example to demonstrate its capabilities and features, and finally looked at ROS integration with external libraries. In this article, we will create a robot model and add sensors and other elements so that our robot is ready for simulation. As a prerequisite, it is advisable to go through the previous article and the sources provided there; this article assumes the reader has done so or has at least a basic understanding of ROS.

Introduction

Implementing something directly on hardware, with the possibility of many iterations, is a time-consuming process. To avoid this, the best practice is to simulate our models first. Simulations help us evaluate scenarios during development: it is very convenient to add a feature to a simulation model and test it, whereas implementing directly on the hardware brings additional challenges that create unnecessary overhead for a project. In our case, creating and simulating a robot helps us understand how it behaves in different scenarios. In the previous article we spoke about the differential drive robot; today we are going to create one using ROS and Gazebo.

Gazebo

As said earlier, Gazebo is an open-source 3D robotics simulator based on the ODE physics engine. It can simulate the dynamics and control required for your robot. Gazebo was created by Dr. Andrew Howard and his student Nate Koenig at the University of Southern California, with the aim of building a high-fidelity simulator for robots in indoor and outdoor conditions. Development continued for a few years, and in 2009 Willow Garage, a technology incubator, integrated ROS and Gazebo. Ever since, Gazebo has been a primary simulator in the ROS community. The following are the requirements for simulating a robot in a world; we will discuss each component in detail.

  • Gazebo world
  • URDF based robot model
  • Sensors in URDF model
  • ROS-Gazebo plugins

Creating Gazebo world

To create a Gazebo world there are two simple methods: use the built-in models from the Gazebo library with the drag-and-drop editor, or re-use pre-built open-source world models and modify them as required. If interested, one can also build an entire model from scratch in any CAD software and export it to a Gazebo-readable format. Once our model is ready, we save it with the ".world" extension, e.g. "example.world" or "classroom.world".
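If no ready-made world fits, writing one by hand is also straightforward. A minimal sketch in SDF, using the ground_plane and sun models that ship with Gazebo's standard model library, could look like this:

```xml
<?xml version="1.0"?>
<sdf version="1.6">
  <world name="example">
    <!-- flat ground to drive on -->
    <include>
      <uri>model://ground_plane</uri>
    </include>
    <!-- a light source, so the scene is not rendered black -->
    <include>
      <uri>model://sun</uri>
    </include>
  </world>
</sdf>
```

Saved as "example.world", this file can be opened directly with Gazebo and extended with further models.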

Components of URDF

URDFs are generally written in the .xacro format, an XML macro language. With xacro, we can construct shorter and more readable XML files by using macros that expand to larger XML expressions. First, let's take a look at the .xacro file from our repo (link provided below) to get the gist of how URDFs are written. There is a lot of XML to parse here, so let's take it bit by bit.

<?xml version='1.0'?>
<robot name="kbot" xmlns:xacro="http://www.ros.org/wiki/xacro">
... 
</robot>

The first line tells the computer that this is XML; then we have the <robot> block. The block has two required parts: first, we declare that we are using xacro by adding the attribute xmlns:xacro="http://www.ros.org/wiki/xacro", and second, we give the model a name, so that ROS knows what to call it: name="kbot".

...
<xacro:property name="cameraSize" value="0.05"/>
<xacro:property name="cameraMass" value="0.1"/>
...

Here we are declaring some constants for xacro: whenever we reference ${cameraSize}, it expands to 0.05 (the camera box edge length, in metres), and ${cameraMass} expands to 0.1 (its mass, in kg).
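For example, when the camera link later in the file writes its geometry in terms of these properties, xacro substitutes the values before the URDF is loaded:

```xml
<!-- as written in the .xacro file -->
<box size="${cameraSize} ${cameraSize} ${cameraSize}"/>

<!-- after xacro expansion -->
<box size="0.05 0.05 0.05"/>
```

Changing the property value in one place therefore updates every element that references it.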

 <link name="left_wheel">
    <!--origin xyz="0.1 0.13 0.1" rpy="0 1.5707 1.5707"/-->
    <collision name="collision">
      <origin xyz="0 0 0" rpy="0 1.5707 1.5707"/>
      <geometry>
        <cylinder radius="0.1" length="0.05"/>
      </geometry>
    </collision>
    <visual name="left_wheel_visual">
      <origin xyz="0 0 0" rpy="0 1.5707 1.5707"/>
      <geometry>
        <cylinder radius="0.1" length="0.05"/>
      </geometry>
    </visual>
    <inertial>
      <origin xyz="0 0 0" rpy="0 1.5707 1.5707"/>
      <mass value="4"/>
      <inertia
        ixx=".1" ixy="0.0" ixz="0.0"
        iyy=".1" iyz="0.0"
        izz=".1"/>
    </inertial>
  </link>

Now, this snippet is where we start to define our links. Each link is a piece of the robot, and in order to simulate our robot, each link has to have three parts: a visual, a collision, and an inertial component.

The visual component tells Gazebo/ROS how to render the part on screen. In this case, we are telling it to draw a wheel of a certain size.

The collision component tells Gazebo/ROS how big the "collision box" of the part should be, i.e. where it should check for collisions. In many cases it is identical to the visual component; however, if the part is a complicated model, it is better to approximate it with a simple box, otherwise your simulations may take far longer than you want them to.

The inertial component tells Gazebo/ROS how much mass the link has and how hard it is to move. Here we need to specify the mass (in kg) and the moments of inertia. For most common shapes, formulas for the moments of inertia can be found on this Wikipedia page: https://en.wikipedia.org/wiki/List_of_moments_of_inertia
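As a worked example, the wheel above is a solid cylinder of mass 4 kg, radius 0.1 m and length 0.05 m; the standard cylinder formulas give the numbers below (note that the URDF snippet above uses rough placeholder values of 0.1 instead). A small Python sketch, with the cylinder axis along the local z axis as URDF's <cylinder> element defines it:

```python
def cylinder_inertia(mass, radius, length):
    """Moments of inertia of a solid cylinder whose symmetry
    axis is the local z axis. Returns (ixx, iyy, izz)."""
    # about the symmetry axis: I = m r^2 / 2
    izz = 0.5 * mass * radius ** 2
    # about the two perpendicular axes: I = m (3 r^2 + l^2) / 12
    ixx = iyy = mass * (3 * radius ** 2 + length ** 2) / 12.0
    return ixx, iyy, izz

ixx, iyy, izz = cylinder_inertia(4.0, 0.1, 0.05)
print(ixx, iyy, izz)  # roughly 0.0108, 0.0108, 0.02
```

These values would go into the ixx/iyy/izz attributes of the `<inertia>` tag (the off-diagonal terms ixy, ixz, iyz stay zero for a symmetric solid).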

<joint type="continuous" name="left_wheel_hinge">
    <origin xyz="0 0.15 0" rpy="0 0 0"/>
    <child link="left_wheel"/>
    <parent link="chassis"/>
    <axis xyz="0 1 0" rpy="0 0 0"/>
    <limit effort="10000" velocity="1000"/>
    <dynamics damping="1.0" friction="1.0"/>
</joint>

Then there are joints. Joints connect the different links, and they come in different types: continuous, where the joint can rotate freely, and fixed, where it acts as a weld, among others. We then specify the child link and the parent link: here the child is the wheel and the parent is the main body (chassis) of the robot. Finally, we define some properties for the joint, such as friction and damping.

Adding sensors to URDF

Now we will add a depth camera sensor and a laser scanner sensor to our URDF. For the laser, we will use the .dae mesh of the Hokuyo sensor; for the camera, we can simply add a rectangular box for visualization. The example code is given below.

Laser code snippet

<!-- Hokuyo Laser -->
  <link name="hokuyo">
    <collision>
      <origin xyz="0 0 0" rpy="0 0 0"/>
      <geometry>
    <box size="0.1 0.1 0.1"/>
      </geometry>
    </collision>

    <visual>
      <origin xyz="0 0 0" rpy="0 0 0"/>
      <geometry>
        <mesh filename="package://kbot_description/meshes/hokuyo.dae"/>
      </geometry>
    </visual>

    <inertial>
      <mass value="1e-5" />
      <origin xyz="0 0 0" rpy="0 0 0"/>
      <inertia ixx="1e-6" ixy="0" ixz="0" iyy="1e-6" iyz="0" izz="1e-6" />
    </inertial>
  </link>

  <joint name="hokuyo_joint" type="fixed">
    <axis xyz="0 1 0" />
    <origin xyz=".15 0 .1" rpy="0 0 0"/>
    <parent link="base_link"/>
    <child link="hokuyo"/>
  </joint>

Camera code snippet

<!-- camera link -->
  <link name="camera">
    <collision>
      <origin xyz="0 0 0" rpy="0 0 0"/>
      <geometry>
        <box size="${cameraSize} ${cameraSize} ${cameraSize}"/>
      </geometry>
    </collision>

    <visual>
      <origin xyz="0 0 0" rpy="0 0 1.57"/>
      <geometry>
        <box size="${cameraSize} ${cameraSize} ${cameraSize}"/>
      </geometry>
      <material name="green"/>
    </visual>

    <inertial>
      <mass value="${cameraMass}" />
      <origin xyz="0 0 0" rpy="0 0 0"/>
      <inertia ixx="1e-6" ixy="0" ixz="0" iyy="1e-6" iyz="0" izz="1e-6" />
    </inertial>
  </link>

  <joint name="camera_joint" type="fixed">
    <axis xyz="0 1 0" />
    <origin xyz=".2 0 0" rpy="0 0 0"/>
    <parent link="base_link"/>
    <child link="camera"/>
  </joint>

Adding Gazebo plugins

Once our URDF is ready, the next step is to add the Gazebo plugins for the sensors and the differential drive functionality.

Laser plugin

<plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_laser.so">
  <topicName>/kbot/laser/scan</topicName>
  <frameName>hokuyo</frameName>
</plugin>
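Note that this snippet shows only the plugin element. In a complete URDF, the laser plugin normally lives inside a `<sensor>` element attached to the hokuyo link via a `<gazebo reference>` block. The sketch below shows a representative wrapper; the scan and range parameters are illustrative values, not ones taken from the repository:

```xml
<gazebo reference="hokuyo">
  <sensor type="ray" name="head_hokuyo_sensor">
    <update_rate>40</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <min_angle>-1.570796</min_angle>
          <max_angle>1.570796</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>10.0</max>
      </range>
    </ray>
    <plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_laser.so">
      <topicName>/kbot/laser/scan</topicName>
      <frameName>hokuyo</frameName>
    </plugin>
  </sensor>
</gazebo>
```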

Camera plugin

<gazebo reference="camera">
  <plugin name="kinect_camera_controller" filename="libgazebo_ros_openni_kinect.so">
    <cameraName>camera</cameraName>
    <alwaysOn>true</alwaysOn>
    <updateRate>10</updateRate>
    <imageTopicName>rgb/image_raw</imageTopicName>
    <depthImageTopicName>depth/image_raw</depthImageTopicName>
    <pointCloudTopicName>depth/points</pointCloudTopicName>
    <cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
    <depthImageCameraInfoTopicName>depth/camera_info</depthImageCameraInfoTopicName>
    <frameName>camera_frame</frameName>
  </plugin>
</gazebo>

Differential drive plugin

 <gazebo>
    <plugin name="differential_drive_controller" filename="libgazebo_ros_diff_drive.so">
      <legacyMode>false</legacyMode>
      <alwaysOn>true</alwaysOn>
      <updateRate>10</updateRate>
      <leftJoint>left_wheel_hinge</leftJoint>
      <rightJoint>right_wheel_hinge</rightJoint>
      <wheelSeparation>0.4</wheelSeparation>
      <wheelDiameter>0.2</wheelDiameter>
      <torque>10</torque>
      <commandTopic>cmd_vel</commandTopic>
      <odometryTopic>odom</odometryTopic>
      <odometryFrame>odom</odometryFrame>
      <robotBaseFrame>base_link</robotBaseFrame>
    </plugin>
 </gazebo>
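The conversion the differential drive plugin performs can be sketched in a few lines of Python: a Twist command (linear velocity v, angular velocity w) arriving on cmd_vel is turned into left and right wheel speeds using the wheelSeparation and wheelDiameter values configured above. The function below is ours, for illustration only:

```python
def diff_drive_wheel_speeds(v, w, wheel_separation=0.4, wheel_diameter=0.2):
    """Convert a body velocity command (v in m/s, w in rad/s)
    into (left, right) wheel angular velocities in rad/s."""
    r = wheel_diameter / 2.0
    # linear speed of each wheel's contact point
    v_left = v - w * wheel_separation / 2.0
    v_right = v + w * wheel_separation / 2.0
    # divide by wheel radius to get angular speed
    return v_left / r, v_right / r

# Driving straight at 0.5 m/s: both wheels spin at the same rate.
print(diff_drive_wheel_speeds(0.5, 0.0))  # (5.0, 5.0)
```

Turning in place (v = 0, w != 0) makes the wheels spin in opposite directions, which is what lets a differential drive robot rotate on the spot.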

After adding all the plugins, we are now ready to move the robot. For moving we use ROS's built-in keyboard teleoperation node from the turtlesim package: install the package if needed, then run the command below in a new terminal, remapping the teleop node's output topic onto our robot's cmd_vel topic. You can use the arrow keys of the keyboard to drive the robot around.

$ rosrun turtlesim turtle_teleop_key turtle1/cmd_vel:=cmd_vel

Link for source files

All the code and the source files for the robot can be found in the given GitHub repository.

Repository link- https://github.com/KPIT-OpenSource/KBot

What’s next?

In our next article, we will simulate the robot we just created and explore a very important topic called SLAM. Readers are encouraged to build a custom robot from scratch and to use the joint_state_publisher package in ROS for debugging.