[ROS Q&A] 117 – How to Launch a ROS Industrial Robots Simulation

 

In this video we will see how to launch a complex industrial environment with several robots in it, including ROS industrial robots and service robots.

The simulation contains a UR5 industrial robot and a couple of mobile bases. It also includes many types of sensors, including lasers and cameras, and even a conveyor belt simulation.

This amazing simulation was created by the OSRF for the 2017 ARIAC competition using the Gazebo simulator.

 

[irp posts=”8409″ name=”RDP 006: Using ROS for Industrial Projects With Carlos Rosales”]

Step 1. Create a project in ROS Development Studio (ROSDS)

ROSDS helps you follow our tutorial at a fast pace without having to set up an environment locally. If you don’t have an account yet, you can create a free account here. You can get the shared project through this link.

Step 2. Run the simulation

We prebuilt the package for this project. You can run the simulation with the following commands:

source ~/simulation_ws/install/setup.bash
rosrun osrf_gear gear.py --development-mode -f /home/user/simulation_ws/install/share/osrf_gear/config/sample.yaml

You can then open the Gazebo simulation from Tools->Gazebo. You should see the whole simulation of a warehouse.
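If you want to verify from code that the simulation is up, the following minimal rospy sketch (an illustrative addition, not part of the prebuilt package) lists the topics currently advertised; in the ARIAC setup, the competition interface topics are normally published under the /ariac namespace:

#!/usr/bin/env python
# Illustrative check: list the topics advertised by the running simulation.
import rospy

rospy.init_node('ariac_topic_check', anonymous=True)
topics = rospy.get_published_topics()  # returns [name, type] pairs from the ROS master
for name, msg_type in sorted(topics):
    print('%s  [%s]' % (name, msg_type))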

Step 3. Demo

We prepared a demo which you can launch with the following command:

./catkin_ws/src/demo.sh

After executing it, you should see the robots moving around.
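For reference, the ARIAC competition interface is normally started through the /ariac/start_competition service (a std_srvs/Trigger service in the 2017 interface). The demo script should take care of this, but if you need to trigger it yourself, the following hedged sketch shows the idea; double-check the service name with rosservice list in your setup:

#!/usr/bin/env python
# Hedged sketch: start the ARIAC competition by calling its Trigger service.
# The name /ariac/start_competition is taken from the 2017 ARIAC interface.
import rospy
from std_srvs.srv import Trigger

rospy.init_node('start_competition_client', anonymous=True)
rospy.wait_for_service('/ariac/start_competition')
start = rospy.ServiceProxy('/ariac/start_competition', Trigger)
response = start()
print('success: %s, message: %s' % (response.success, response.message))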

We will use this simulation in the upcoming ROS Developers Live Class #20 to play with robots in industrial environments:

[irp posts=”9470″ name=”ROS Developers LIVE-Class #20: Simulate an Industrial Environment”]

 

Edit by: Tony Huang

RELATED LINKS
▸ OSRF: https://www.osrfoundation.org/
▸ ARIAC competition: http://gazebosim.org/ariac

Robot Ignite Academy
ROS Industrial online course
ROS Development Studio


 


 

 

[ROS Q&A] 116 – Launching Husarion ROSbot navigation demo in Gazebo simulation

 

In this video we will learn how to install the Husarion ROSbot Gazebo simulation in just 5 minutes and how to launch the mapping and navigation demos that it includes.

[irp posts=”8512″ name=”RDP 008: Building Accessible ROSbots With Jack Pien”]

RELATED LINKS
▸ Husarion web page: https://husarion.com/
▸ Husarion ROSbot simulation git: https://github.com/husarion/ROSbot_description

▸ Robot Ignite Academy: https://www.robotigniteacademy.com/
▸ ROS Navigation in 5 days online course: https://goo.gl/bbNXRH
▸ ROS Development Studio: https://www.theconstruct.ai/rds-ros-development-studio/



[ROS Q&A] 110 – How to launch two drones on a Single Gazebo Simulation

We show you how you can launch two drones (or more) in the same Gazebo simulation, each one having its own independent control system based on ROS. This procedure can be replicated to launch as many drones as required.

RELATED LINKS

– How to start programming drones with ROS: https://youtu.be/f7b5tSZW1Ig
– Hector Quadrotor Simulation: https://bitbucket.org/theconstructcore/hector_quadrotor_sim
– ROS Development Studio: https://goo.gl/Yf2Q4J

[irp posts=”6638″ name=”ROS Q&A | How to Start Programming Drones using ROS”]

 

***



ROS Q&A | How to Start Programming Drones using ROS


In this video answer, we walk through the basics of a Parrot AR Drone Gazebo simulation.
You will learn about the topics provided by the simulation and how to use a ROS program to interact with this robot, sending commands and reading sensors.

[irp posts=”8190″ name=”Performing LSD-SLAM with a ROS based Parrot AR. Drone”]

Let’s get started!

Step 1. Create a new project in ROS Development Studio (RDS)

We’ll use ROS Development Studio (RDS) for this tutorial. You can register a free account here.

After logging into RDS, click on create my project. It will take you to the public simulations. You can find tons of public simulations here, offered by The Construct for free, and start working on any of them in just minutes. See how powerful RDS is! For today, we’ll use the sjtu_drone_tc project. Please click on it. In the Tools menu, you can find several tools that help you develop in RDS. For example, you have the:

  1. Shell: the terminal where you can execute commands in RDS. You can open as many shells as you want!
  2. IDE: the best way to explore the source tree of your project. With a right click, you can add or remove files easily.
  3. Jupyter Notebook: you can take notes for your project here. Since it runs a Python shell, you can execute Python scripts directly. A default notebook is provided to help you start the simulation.
  4. Graphical tools: you can use all the GUIs supported by ROS here (e.g. RViz, rqt_gui, etc.)

Notice:

The simulation is no longer started automatically when you start RDS. In order to have the same simulation shown in the video, please go to Simulations->Select launch file->main.launch to launch it yourself. Then you can type the following command in a shell to check that the topics are being correctly published by the drone.

$ rostopic list

 


Step 2. Get started with the simulation

Let’s get started by following the instructions in the default Jupyter notebook. Open it from Tools->Jupyter Notebook->default.ipynb.

We can make the drone take off with the shell command:

$ rostopic pub /drone/takeoff std_msgs/Empty "{}"

You should see the drone take off as soon as you send this command.

Notice:

You can use the ROS auto-completion function while you are typing a ROS command by pressing [TAB]. It’s a good idea to do that when the command is long and hard to type correctly.

You can also land the drone with the following command:

$ rostopic pub /drone/land std_msgs/Empty "{}"

You can also find instructions in default.ipynb that show you how to do it with a Python script instead of sending commands from the shell; a minimal sketch of the same idea follows.
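This sketch is not the notebook’s script, just an illustration; it assumes the /drone/takeoff and /drone/land topics used above and that no ROS node has been initialized yet:

#!/usr/bin/env python
# Minimal sketch: take off, hover for a few seconds, then land.
import rospy
from std_msgs.msg import Empty

rospy.init_node('takeoff_land_example', anonymous=True)
takeoff_pub = rospy.Publisher('/drone/takeoff', Empty, queue_size=1)
land_pub = rospy.Publisher('/drone/land', Empty, queue_size=1)

rospy.sleep(1)                # give the publishers time to connect
takeoff_pub.publish(Empty())  # same effect as the rostopic pub command above
rospy.sleep(5)                # hover for a while
land_pub.publish(Empty())     # land again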


Step 3. Programming the drone

We have more examples for you! Let’s say we want to use the position control function provided by the drone. We found that there is a topic called /drone/posctrl, but how do we use it? By typing

rostopic info /drone/posctrl

you’ll see output like this:

Type: std_msgs/Bool

Publishers:
* /my_node (http://ip-172-31-35-31:45972/)

Subscribers:
* /gazebo (http://10.8.0.1:44685/)

It seems that the topic uses the Bool message, but what is a Bool message and how can we use it? You can investigate further by typing

rosmsg show std_msgs/Bool

and you get the output

bool data

The Bool message is very simple: it contains only one attribute called data, of type bool. Let’s try to send a message to this topic! Before we publish to it, we first set up a monitor with

rostopic echo /drone/posctrl

Then copy, paste and execute the following code in the Jupyter notebook.

import rospy
from std_msgs.msg import Bool

# Assumes a ROS node has already been initialized in the notebook
# (otherwise call rospy.init_node() first).
pub_posctrl = rospy.Publisher('/drone/posctrl', Bool, queue_size=1)
rospy.sleep(1)  # give the new publisher time to connect before publishing

var_bool = Bool()
var_bool.data = True
pub_posctrl.publish(var_bool)

You should see

data: True

which means the message was published correctly and the position control function on the drone is now enabled. Similarly, you can move the drone by publishing a Twist message to the /cmd_vel topic. Here is an example script:

import rospy
from geometry_msgs.msg import Twist

# Again, this assumes the notebook has already initialized a ROS node.
pub_position = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rospy.sleep(1)  # wait for the publisher connection

var_twist = Twist()
var_twist.linear.x = 1
var_twist.linear.y = 1
var_twist.linear.z = 2
pub_position.publish(var_twist)

Now you know how to start programming drones easily with RDS! You can do lots of things in RDS (e.g. using the cameras on the drone to implement computer vision algorithms). If you are interested but have no idea how to do it, you can check our Programming Drones with ROS course on Robot Ignite Academy!

 

Edit by Tony Huang

[Links and resources mentioned in the video]

– ROS Development Studio: https://goo.gl/1Qb4AT

– The course of Programming Drones with ROS is available here: https://goo.gl/x6yaZW



How to create a ROS Sensor Plugin for Gazebo


 

There are magnificent tutorials about how to create plugins for Gazebo on the GazeboSim webpage. There are even some tutorials about how to create plugins for Gazebo + ROS. Those tutorials show that there are several types of plugins (world, model, sensor, system, visual), and indicate how to create a world-type plugin.

Recently I needed to create a plugin for a light detector. Reading the tutorials, I missed a concrete example of how to create a sensor plugin, so I had to investigate a little bit. The result is the content of this post.

 

How to: light sensor plugin in Gazebo

Following the indications provided on the Gazebo answers forum, I decided to build a very simple light detector sensor based on a camera. Instead of raytracing from the lights, the idea is to use a camera to capture an image, calculate the illuminance of that image, and then publish the illuminance value on a ROS topic.

Since the plugin is meant to be used with ROS, it should be compilable in a ROS environment. Hence, be sure that you have the following packages installed on your Linux system:

  • ros-<your_ros_version>-<your_gazebo_version>-ros (in my case it is ros-jade-gazebo6-ros)
  • ros-<your_ros_version>-<your_gazebo_version>-plugins (in my case it is ros-jade-gazebo6-plugins)

This tutorial has two parts: the first explains how to create the plugin, and the second shows how to test that it works.

 

Creating the plugin

Creating a ROS package for the plugin

The first thing is to create, in our catkin workspace, the package that will allow us to compile the plugin without problems.

cd ~/catkin_ws/src
catkin_create_pkg gazebo_light_sensor_plugin gazebo_ros gazebo_plugins roscpp

Creating the plugin code

For this purpose, since we are using a camera to capture the light, we are going to create a plugin class that inherits from CameraPlugin. The code that follows was created using the code of the official Gazebo ROS camera plugin as a guideline.

Create a file called light_sensor_plugin.h inside the include/gazebo_light_sensor_plugin directory of your package (so that it matches the #include path used in the .cpp below), containing the following code:

#ifndef GAZEBO_ROS_LIGHT_SENSOR_HH
#define GAZEBO_ROS_LIGHT_SENSOR_HH

#include <string>

// library for processing camera data for gazebo / ros conversions
#include <gazebo/plugins/CameraPlugin.hh>

#include <gazebo_plugins/gazebo_ros_camera_utils.h>

namespace gazebo
{
  class GazeboRosLight : public CameraPlugin, GazeboRosCameraUtils
  {
    /// \brief Constructor
    /// \param parent The parent entity, must be a Model or a Sensor
    public: GazeboRosLight();

    /// \brief Destructor
    public: ~GazeboRosLight();

    /// \brief Load the plugin
    /// \param take in SDF root element
    public: void Load(sensors::SensorPtr _parent, sdf::ElementPtr _sdf);

    /// \brief Update the controller
    protected: virtual void OnNewFrame(const unsigned char *_image,
    unsigned int _width, unsigned int _height,
    unsigned int _depth, const std::string &_format);

    ros::NodeHandle _nh;
    ros::Publisher _sensorPublisher;

    double _fov;
    double _range;
  };
}
#endif

As you can see, the code includes a node handle to connect to the roscore. It also defines a publisher that will publish messages containing the illuminance value. Two parameters have been defined: fov (field of view) and range. At present only fov is used, to indicate the number of pixels around the center of the image that will be taken into account to calculate the illuminance.

The next step is to create a file named light_sensor_plugin.cpp containing the following code in the src directory of your package:

#include <gazebo/common/Plugin.hh>
#include <ros/ros.h>
#include "gazebo_light_sensor_plugin/light_sensor_plugin.h"

#include "gazebo_plugins/gazebo_ros_camera.h"

#include <string>

#include <gazebo/sensors/Sensor.hh>
#include <gazebo/sensors/CameraSensor.hh>
#include <gazebo/sensors/SensorTypes.hh>

#include <sensor_msgs/Illuminance.h>

namespace gazebo
{
  // Register this plugin with the simulator
  GZ_REGISTER_SENSOR_PLUGIN(GazeboRosLight)

  ////////////////////////////////////////////////////////////////////////////////
  // Constructor
  GazeboRosLight::GazeboRosLight():
  _nh("light_sensor_plugin"),
  _fov(6),
  _range(10)
  {
    _sensorPublisher = _nh.advertise<sensor_msgs::Illuminance>("lightSensor", 1);
  }

  ////////////////////////////////////////////////////////////////////////////////
  // Destructor
  GazeboRosLight::~GazeboRosLight()
  {
    ROS_DEBUG_STREAM_NAMED("camera","Unloaded");
  }

  void GazeboRosLight::Load(sensors::SensorPtr _parent, sdf::ElementPtr _sdf)
  {
    // Make sure the ROS node for Gazebo has already been initialized
    if (!ros::isInitialized())
    {
      ROS_FATAL_STREAM("A ROS node for Gazebo has not been initialized, unable to load plugin. "
        << "Load the Gazebo system plugin 'libgazebo_ros_api_plugin.so' in the gazebo_ros package)");
      return;
    }

    CameraPlugin::Load(_parent, _sdf);
    // copying from CameraPlugin into GazeboRosCameraUtils
    this->parentSensor_ = this->parentSensor;
    this->width_ = this->width;
    this->height_ = this->height;
    this->depth_ = this->depth;
    this->format_ = this->format;
    this->camera_ = this->camera;

    GazeboRosCameraUtils::Load(_parent, _sdf);
  }

  ////////////////////////////////////////////////////////////////////////////////
  // Update the controller
  void GazeboRosLight::OnNewFrame(const unsigned char *_image,
    unsigned int _width, unsigned int _height, unsigned int _depth,
    const std::string &_format)
  {
    static int seq=0;

    this->sensor_update_time_ = this->parentSensor_->GetLastUpdateTime();

    if (!this->parentSensor->IsActive())
    {
      if ((*this->image_connect_count_) > 0)
      // do this first so there's chance for sensor to run once after activated
        this->parentSensor->SetActive(true);
    }
    else
    {
      if ((*this->image_connect_count_) > 0)
      {
        common::Time cur_time = this->world_->GetSimTime();
        if (cur_time - this->last_update_time_ >= this->update_period_)
        {
          this->PutCameraData(_image);
          this->PublishCameraInfo();
          this->last_update_time_ = cur_time;

          sensor_msgs::Illuminance msg;
          msg.header.stamp = ros::Time::now();
          msg.header.frame_id = "";
          msg.header.seq = seq;

          int startingPix = _width * ( (int)(_height/2) - (int)( _fov/2)) - (int)(_fov/2);

          double illum = 0;
          for (int i=0; i<_fov ; ++i)
          {
            int index = startingPix + i*_width;
            for (int j=0; j<_fov ; ++j)
              illum += _image[index+j];
          }

          msg.illuminance = illum/(_fov*_fov);
          msg.variance = 0.0;

          _sensorPublisher.publish(msg);

          seq++;
        }
      }
    }
  }
}

That is the code that calculates the illuminance in a very simple way: it just adds the values of all the pixels in the fov window of the camera image and then divides by the total number of pixels considered.
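To see the computation outside the plugin, here is a standalone sketch of the same kind of averaging in Python with NumPy; it assumes an 8-bit single-channel image and takes a fov x fov window centered on the image, which is the intent of the loop above:

import numpy as np

def illuminance(image, fov):
    # Mean pixel value of a fov x fov window centered on the image.
    h, w = image.shape
    r0 = h // 2 - fov // 2
    c0 = w // 2 - fov // 2
    window = image[r0:r0 + fov, c0:c0 + fov]
    return window.astype(float).mean()

# Example with a synthetic 240x320 grayscale image (same size as in the world file below)
img = np.random.randint(0, 256, size=(240, 320), dtype=np.uint8)
print(illuminance(img, fov=6))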

Create a proper CMakeLists.txt

Replace the content of the automatically created CMakeLists.txt with the code below:

cmake_minimum_required(VERSION 2.8.3)
project(gazebo_light_sensor_plugin)

find_package(catkin REQUIRED COMPONENTS
  gazebo_plugins
  gazebo_ros
  roscpp
)

find_package (gazebo REQUIRED)

catkin_package(
  INCLUDE_DIRS include
  CATKIN_DEPENDS gazebo_plugins gazebo_ros roscpp
)

###########
## Build ##
###########

set(CMAKE_CXX_FLAGS "-std=c++11 ${CMAKE_CXX_FLAGS}")

link_directories(${GAZEBO_LIBRARY_DIRS})
include_directories(include)
include_directories( ${catkin_INCLUDE_DIRS} 
                     ${Boost_INCLUDE_DIR} 
                     ${GAZEBO_INCLUDE_DIRS}
)

add_library(${PROJECT_NAME} src/light_sensor_plugin.cpp)

## Specify libraries to link a library or executable target against
target_link_libraries( ${PROJECT_NAME} ${catkin_LIBRARIES} ${GAZEBO_LIBRARIES} CameraPlugin )

Update the package.xml and compile

Now you need to include the following line in your package.xml, between the <export></export> tags:

<gazebo_ros plugin_path="${prefix}/lib" gazebo_media_path="${prefix}" />

Now you are ready to compile the plugin. Compilation should generate the library containing the plugin inside your workspace (with catkin_make it typically ends up under devel/lib).

> roscd
> cd ..
> catkin_make

Testing the Plugin

Let’s create a world file containing the plugin and launch it to see how it works.

Create a world file

You need a world file that includes the plugin. Here is an example. Create a worlds directory inside your plugin package, and save the following code in a file named light.world. This world file just loads the camera with its plugin, so it may be a bit ugly, but it is enough for your tests. Feel free to add more elements and models to the world file (like, for example, in the picture at the top of this post).

<?xml version="1.0" ?>
<sdf version="1.4">
 <world name="default">
 <include>
   <uri>model://ground_plane</uri>
 </include>

 <include>
   <uri>model://sun</uri>
 </include>

 <!-- reference to your plugin -->
 <model name='camera'>
   <pose>0 -1 0.05 0 -0 0</pose>
   <link name='link'>
     <inertial>
       <mass>0.1</mass>
       <inertia>
         <ixx>1</ixx>
         <ixy>0</ixy>
         <ixz>0</ixz>
         <iyy>1</iyy>
         <iyz>0</iyz>
         <izz>1</izz>
       </inertia>
     </inertial>
     <collision name='collision'>
       <geometry>
         <box>
            <size>0.1 0.1 0.1</size>
         </box>
       </geometry>
       <max_contacts>10</max_contacts>
       <surface>
         <contact>
           <ode/>
         </contact>
         <bounce/>
         <friction>
           <ode/>
         </friction>
       </surface>
     </collision>
     <visual name='visual'>
       <geometry>
         <box>
           <size>0.1 0.1 0.1</size>
         </box>
       </geometry>
     </visual>
     <sensor name='camera' type='camera'>
       <camera name='__default__'>
         <horizontal_fov>1.047</horizontal_fov>
         <image>
           <width>320</width>
           <height>240</height>
         </image>
         <clip>
           <near>0.1</near>
           <far>100</far>
         </clip>
       </camera>
       <plugin name="gazebo_light_sensor_plugin" filename="libgazebo_light_sensor_plugin.so">
         <cameraName>camera</cameraName>
         <alwaysOn>true</alwaysOn>
         <updateRate>10</updateRate>
         <imageTopicName>rgb/image_raw</imageTopicName>
         <depthImageTopicName>depth/image_raw</depthImageTopicName>
         <pointCloudTopicName>depth/points</pointCloudTopicName>
         <cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
         <depthImageCameraInfoTopicName>depth/camera_info</depthImageCameraInfoTopicName>
         <frameName>camera_depth_optical_frame</frameName>
         <baseline>0.1</baseline>
         <distortion_k1>0.0</distortion_k1>
         <distortion_k2>0.0</distortion_k2>
         <distortion_k3>0.0</distortion_k3>
         <distortion_t1>0.0</distortion_t1>
         <distortion_t2>0.0</distortion_t2>
         <pointCloudCutoff>0.4</pointCloudCutoff>
         <robotNamespace>/</robotNamespace>
       </plugin>
     </sensor>
     <self_collide>0</self_collide>
     <kinematic>0</kinematic>
     <gravity>1</gravity>
   </link>
 </model>
 </world>
</sdf>

 

Create a launch file

Now the final step: create a launch file that will bring everything up for you. Save the following code as main.launch inside the launch directory of your package.

<launch>
  <!-- We resume the logic in empty_world.launch, changing only the name of the world to be launched -->
  <include file="$(find gazebo_ros)/launch/empty_world.launch">
    <arg name="verbose" value="true"/>
    <arg name="world_name" value="$(find gazebo_light_sensor_plugin)/worlds/light.world"/>
    <!-- more default parameters can be changed here -->
  </include>
</launch>

 

Ready to run!

Now launch the world. Be sure that a roscore is running on your machine, and that the GAZEBO_PLUGIN_PATH environment variable includes the path to the new plugin library.
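If you are not sure whether GAZEBO_PLUGIN_PATH is set correctly, a quick illustrative check from Python is to look for the compiled library in each entry of the variable (adjust the library name if you renamed the project):

#!/usr/bin/env python
# Quick check: is the compiled plugin visible through GAZEBO_PLUGIN_PATH?
import os

plugin_lib = 'libgazebo_light_sensor_plugin.so'
paths = [p for p in os.environ.get('GAZEBO_PLUGIN_PATH', '').split(':') if p]
found = any(os.path.isfile(os.path.join(p, plugin_lib)) for p in paths)
print('GAZEBO_PLUGIN_PATH entries:', paths)
print('plugin found:', found)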

Now execute the following command:

roslaunch gazebo_light_sensor_plugin main.launch

You can see what the camera is observing by running the following command:

rosrun image_view image_view image:=/camera/rgb/image_raw

After running that command, a small window will appear on your screen showing what the camera is capturing. Of course, since your world is completely empty, you will only see something as ugly as this:

Screenshot from 2016-02-11 13:07:19

Try adding some stuff in front of the camera and see how it actually works.

Screenshot from 2015-05-20 17:34:57

Now it is time to check the value of illuminance by watching the published topic (/light_sensor_plugin/lightSensor). Just type the following and you are done:

rostopic echo /light_sensor_plugin/lightSensor

You should see the topic messages being published on your screen, something like this:

Screenshot from 2015-05-20 17:35:32
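If you prefer to read the illuminance from your own node instead of rostopic echo, a minimal subscriber sketch looks like this (the topic name comes from the node handle namespace "light_sensor_plugin" and the "lightSensor" topic advertised in the plugin constructor):

#!/usr/bin/env python
# Minimal subscriber for the illuminance published by the plugin.
import rospy
from sensor_msgs.msg import Illuminance

def callback(msg):
    rospy.loginfo('illuminance: %.2f', msg.illuminance)

rospy.init_node('light_sensor_listener')
rospy.Subscriber('/light_sensor_plugin/lightSensor', Illuminance, callback)
rospy.spin()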

 

Conclusion

Now you have a plugin for your Gazebo simulations that can measure (very roughly) the light detected. It can be improved in many ways, but it serves as a starting point for understanding the complex world of plugins within Gazebo.

You can use it in your desktop Gazebo or even inside the ROS Development Studio. It is also independent of the ROS version you are using (just install the proper packages).

 

Do you have any interesting modifications for this plugin? What about computing variance? Or what about computing illuminance by raytracing to lights? Please share your mods here!
