In this video we will see how to launch a complex industrial environment with several robots in it, including ROS industrial robots and service robots.
The simulation contains a UR5 industrial robot and a couple of mobile bases. It also includes many types of sensors, such as lasers and cameras, and even a conveyor belt simulation.
This amazing simulation was created by the OSRF for their 2017 ARIAC competition using the Gazebo simulator.
Step 1. Create a project in ROS Development Studio (ROSDS)
ROSDS helps you follow our tutorial at a fast pace without having to set up an environment locally. If you don't have an account yet, you can create a free account here. You can get the shared project through this link.
Step 2. Run the simulation
We prebuilt the package for this project. You can run the simulation with the following command
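The exact launch file depends on how the shared project is set up; with the public ARIAC 2017 osrf_gear packages (this command is an assumption, not taken from the project itself), something along these lines starts the environment:

roslaunch osrf_gear sample_environment.launch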
In this video we will learn how to install the ROSBot Gazebo simulation in just 5 minutes and how to launch the mapping and navigation demos that it includes.
RELATED LINKS
▸ Husarion web page: https://husarion.com/
▸ Husarion ROSbot simulation git: https://github.com/husarion/ROSbot_description
▸ Robot Ignite Academy: https://www.robotigniteacademy.com/
▸ ROS Navigation in 5 days online course: https://goo.gl/bbNXRH
▸ ROS Development Studio: https://www.theconstruct.ai/rds-ros-development-studio/
We show you how you can launch two drones (or more) in the same Gazebo simulation, each one having its own independent control system based on ROS. This procedure can be replicated to launch as many drones as required.
RELATED LINKS
▸ How to start programming drones with ROS: https://youtu.be/f7b5tSZW1Ig
▸ Hector Quadrotor Simulation: https://bitbucket.org/theconstructcore/hector_quadrotor_sim
▸ ROS Development Studio: https://goo.gl/Yf2Q4J
In this video answer, we walk through the basics of a Parrot AR Drone Gazebo simulation. You will learn which topics the simulation provides and how to use a ROS program to interact with this robot, sending commands or reading sensors.
Let’s get started!
Step 1. Create a new project on ROS Development Studio (RDS)
We'll use the ROS Development Studio (RDS) for this tutorial. You can register a free account here.
After logging into RDS, click on Create My Project. This takes you to the public simulations, where you can find tons of simulations offered by The Construct for free and start working on any of them in just minutes. See how powerful RDS is! For today, we'll use the sjtu_drone_tc project, so please click on it. In the Tools menu you will find several tools that help you develop in RDS. For example, you have the:
Shell: the terminal where you can execute commands in RDS. You can open as many shells as you want!
IDE: It’s the best way to explore the source tree of your project. With a right click, you can add or remove files easily.
Jupyter Notebook: you can take notes for your project here. Since it runs a Python shell, you can execute Python scripts directly. A default notebook is provided to help you start the simulation.
Graphical tool: you can use all the GUIs supported by ROS here (e.g. RViz, rqt_gui, etc.)
Notice:
The simulation is no longer started automatically when you open the project in RDS. To get the same simulation shown in the video, go to Simulations -> Select launch file -> main.launch and launch it yourself. Then you can type the following command in a shell to check whether the drone's topics are being published correctly.
$ rostopic list
Step 2. Get started with the simulation
Let's get started by following the instructions in the default Jupyter notebook. Open it from Tools -> Jupyter Notebook -> default.ipynb.
We can make the drone take off with the shell command
$ rostopic pub /drone/takeoff std_msgs/Empty "{}"
You should see the drone take off as soon as you send this command.
Notice:
You can use the ROS auto-completion function while typing a ROS command by pressing [TAB]. It's a good idea to do that when the command is long and hard to type correctly.
You can also land the drone with the following command
$ rostopic pub /drone/land std_msgs/Empty "{}"
You can also find instructions in default.ipynb that show you how to do it with a Python script instead of sending commands from the shell.
Step 3. Program with drones
We have more examples for you! Let's say we want to use the position control function provided by the drone. We found that there is a topic called /drone/posctrl, but how do we use it? We can start by checking which message type the topic expects, as shown below.
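A standard way to inspect a topic is rostopic info (this assumes the drone simulation is already running):

rostopic info /drone/posctrl

The Type field of its output reports std_msgs/Bool.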
It seems that the topic uses the Bool message, but what is a Bool message and how can we use it? You can investigate it further by typing
rosmsg show std_msgs/Bool
and you will get the output
bool data
The Bool message is very simple: it contains only one attribute called data, of type bool. Let's try to send a message to this topic! Before we publish to the topic, let's first set up a monitor with
rostopic echo /drone/posctrl
Then copy, paste and execute the following code in jupyter notebook.
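A minimal Python sketch of such a snippet (assuming rospy is available in the notebook environment; the node name enable_posctrl is only illustrative) is:

# Publish True to /drone/posctrl to enable position control
import rospy
from std_msgs.msg import Bool

rospy.init_node('enable_posctrl', anonymous=True)
pub = rospy.Publisher('/drone/posctrl', Bool, queue_size=1)
rospy.sleep(1.0)  # give the publisher time to connect to subscribers
pub.publish(Bool(data=True))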
You should see data: True printed by the rostopic echo monitor, which means the message was published correctly and the position control function is now enabled on the drone. Similarly, you can move the drone by publishing Twist messages to the /cmd_vel topic. Here is an example script:
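A short sketch of such a script (again assuming rospy; the velocity value and duration are only illustrative) could be:

# Move the drone by publishing Twist messages to /cmd_vel
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('move_drone', anonymous=True)
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rospy.sleep(1.0)  # wait for the connection to be established

cmd = Twist()
cmd.linear.x = 0.5  # forward velocity in m/s
rate = rospy.Rate(10)
for _ in range(30):  # publish for roughly three seconds
    pub.publish(cmd)
    rate.sleep()

pub.publish(Twist())  # send zero velocities to stop the drone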
Now you know how to start programming drones easily with RDS! You can do lots of things in RDS (e.g. using the cameras on the drone to implement computer vision algorithms). If you are interested but have no idea how to do it, you can check our Programming Drones with ROS course on the Robot Ignite Academy!
There are magnificent tutorials about how to create plugins for Gazebo on the GazeboSim webpage. There are even some tutorials about how to create plugins for Gazebo + ROS. Those tutorials show that there are several types of plugins (world, model, sensor, system, visual), and indicate how to create a world-type plugin.
Recently I needed to create a plugin for a light detector. Reading the tutorials, I missed a concrete example about how to create a sensor plugin. Hence, I had to investigate a little bit about it. The result is the content of this post.
How to: light sensor plugin in Gazebo
Following the indications provided on the Gazebo Answers forum, I decided to build a very simple light detector sensor based on a camera. Instead of using a raytracing algorithm from the lights, the idea is to use a camera to capture an image, use that image to calculate its illuminance, and then publish the illuminance value through a ROS topic.
Since the plugin is meant to be used with ROS, the whole plugin should be compilable in a ROS environment. Hence, be sure that you have installed the following packages on your Linux system:
ros-<your_ros_version>-<your_gazebo_version>-ros (in my case it is ros-jade-gazebo6-ros)
ros-<your_ros_version>-<your_gazebo_version>-plugins (in my case it is ros-jade-gazebo6-plugins)
This tutorial has two parts: in the first one we explain how to create the plugin, and in the second, how to test that it works.
Creating the plugin
Creating a ROS package for the plugin
The first thing is to create, in our catkin workspace, the package that will allow us to compile the plugin without problems.
cd ~/catkin_ws/src
catkin_create_pkg gazebo_light_sensor_plugin gazebo_ros gazebo_plugins roscpp
Creating the plugin code
For this purpose, since we are using a camera to capture the light, we are going to create a plugin class that inherits from CameraPlugin. The code that follows was created taking as a guideline the code of the official gazebo_ros camera plugin.
Create a file called light_sensor_plugin.h inside the include directory of your package, including the following code:
#ifndef GAZEBO_ROS_LIGHT_SENSOR_HH
#define GAZEBO_ROS_LIGHT_SENSOR_HH
#include <string>
// library for processing camera data for gazebo / ros conversions
#include <gazebo/plugins/CameraPlugin.hh>
#include <gazebo_plugins/gazebo_ros_camera_utils.h>
namespace gazebo
{
class GazeboRosLight : public CameraPlugin, GazeboRosCameraUtils
{
/// \brief Constructor
/// \param parent The parent entity, must be a Model or a Sensor
public: GazeboRosLight();
/// \brief Destructor
public: ~GazeboRosLight();
/// \brief Load the plugin
/// \param take in SDF root element
public: void Load(sensors::SensorPtr _parent, sdf::ElementPtr _sdf);
/// \brief Update the controller
protected: virtual void OnNewFrame(const unsigned char *_image,
unsigned int _width, unsigned int _height,
unsigned int _depth, const std::string &_format);
/// ROS node handle and publisher for the illuminance messages
ros::NodeHandle _nh;
ros::Publisher _sensorPublisher;
/// Number of pixels around the image center used to compute the illuminance
double _fov;
/// Sensor range (currently unused)
double _range;
};
}
#endif
As you can see, the code includes a node handle to connect to the roscore. It also defines a publisher that will publish messages containing the illuminance value. Two parameters have been defined: fov (field of view) and range. At present, only fov is used; it indicates the number of pixels around the center of the image that will be taken into account to calculate the illuminance.
The next step is to create a file named light_sensor_plugin.cpp in the src directory of your package, containing the following code:
#include <gazebo/common/Plugin.hh>
#include <ros/ros.h>
#include "gazebo_light_sensor_plugin/light_sensor_plugin.h"
#include "gazebo_plugins/gazebo_ros_camera.h"
#include <string>
#include <gazebo/sensors/Sensor.hh>
#include <gazebo/sensors/CameraSensor.hh>
#include <gazebo/sensors/SensorTypes.hh>
#include <sensor_msgs/Illuminance.h>
namespace gazebo
{
// Register this plugin with the simulator
GZ_REGISTER_SENSOR_PLUGIN(GazeboRosLight)
////////////////////////////////////////////////////////////////////////////////
// Constructor
GazeboRosLight::GazeboRosLight():
_nh("light_sensor_plugin"),
_fov(6),
_range(10)
{
_sensorPublisher = _nh.advertise<sensor_msgs::Illuminance>("lightSensor", 1);
}
////////////////////////////////////////////////////////////////////////////////
// Destructor
GazeboRosLight::~GazeboRosLight()
{
ROS_DEBUG_STREAM_NAMED("camera","Unloaded");
}
void GazeboRosLight::Load(sensors::SensorPtr _parent, sdf::ElementPtr _sdf)
{
// Make sure the ROS node for Gazebo has already been initialized
if (!ros::isInitialized())
{
ROS_FATAL_STREAM("A ROS node for Gazebo has not been initialized, unable to load plugin. "
<< "Load the Gazebo system plugin 'libgazebo_ros_api_plugin.so' in the gazebo_ros package)");
return;
}
CameraPlugin::Load(_parent, _sdf);
// copying from CameraPlugin into GazeboRosCameraUtils
this->parentSensor_ = this->parentSensor;
this->width_ = this->width;
this->height_ = this->height;
this->depth_ = this->depth;
this->format_ = this->format;
this->camera_ = this->camera;
GazeboRosCameraUtils::Load(_parent, _sdf);
}
////////////////////////////////////////////////////////////////////////////////
// Update the controller
void GazeboRosLight::OnNewFrame(const unsigned char *_image,
unsigned int _width, unsigned int _height, unsigned int _depth,
const std::string &_format)
{
static int seq=0;
this->sensor_update_time_ = this->parentSensor_->GetLastUpdateTime();
if (!this->parentSensor->IsActive())
{
if ((*this->image_connect_count_) > 0)
// do this first so there's chance for sensor to run once after activated
this->parentSensor->SetActive(true);
}
else
{
if ((*this->image_connect_count_) > 0)
{
common::Time cur_time = this->world_->GetSimTime();
if (cur_time - this->last_update_time_ >= this->update_period_)
{
this->PutCameraData(_image);
this->PublishCameraInfo();
this->last_update_time_ = cur_time;
sensor_msgs::Illuminance msg;
msg.header.stamp = ros::Time::now();
msg.header.frame_id = "";
msg.header.seq = seq;
int startingPix = _width * ( (int)(_height/2) - (int)( _fov/2)) - (int)(_fov/2);
double illum = 0;
for (int i=0; i<_fov ; ++i)
{
int index = startingPix + i*_width;
for (int j=0; j<_fov ; ++j)
illum += _image[index+j];
}
msg.illuminance = illum/(_fov*_fov);
msg.variance = 0.0;
_sensorPublisher.publish(msg);
seq++;
}
}
}
}
}
That is the code that calculates the illuminance in a very simple way: it adds the values of the pixels in a fov-by-fov square around the center of the image and then divides by the number of pixels in that square.
Create a proper CMakeLists.txt
Replace the content of the automatically created CMakeLists.txt with the code below:
cmake_minimum_required(VERSION 2.8.3)
project(gazebo_light_sensor_plugin)
find_package(catkin REQUIRED COMPONENTS
gazebo_plugins
gazebo_ros
roscpp
)
find_package (gazebo REQUIRED)
catkin_package(
INCLUDE_DIRS include
CATKIN_DEPENDS gazebo_plugins gazebo_ros roscpp
)
###########
## Build ##
###########
set(CMAKE_CXX_FLAGS "-std=c++11 ${CMAKE_CXX_FLAGS}")
link_directories(${GAZEBO_LIBRARY_DIRS})
include_directories(include)
include_directories( ${catkin_INCLUDE_DIRS}
${Boost_INCLUDE_DIR}
${GAZEBO_INCLUDE_DIRS}
)
add_library(${PROJECT_NAME} src/light_sensor_plugin.cpp)
## Specify libraries to link a library or executable target against
target_link_libraries( ${PROJECT_NAME} ${catkin_LIBRARIES} ${GAZEBO_LIBRARIES} CameraPlugin )
Update the package.xml and compile
Now you need to include the following line in your package.xml, between the <export></export> tags:
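The line that Gazebo plugin packages conventionally export (this follows the usual gazebo_ros convention; adapt the paths if your layout differs) is:

<gazebo_ros plugin_path="${prefix}/lib" gazebo_media_path="${prefix}"/>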
Now you are ready to compile the plugin. Compilation should generate the library containing the plugin inside your workspace's devel/lib directory.
> roscd
> cd ..
> catkin_make
Testing the Plugin
Let’s create a world file containing the plugin and launch it to see how it works
Create a world file
You need a world file that includes the plugin. Here is an example. Create a worlds directory inside your plugin package and save the code below in a file named light.world. This world file just loads the camera with its plugin, so it may look a bit bare, but it is enough for your tests. Feel free to add more elements and models to the world file (like, for example, in the picture at the top of this post).
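A minimal sketch of such a world (it assumes the library built above is called libgazebo_light_sensor_plugin.so, which follows from the CMakeLists project name, and it uses the standard GazeboRosCameraUtils parameters such as cameraName and imageTopicName; adjust poses and values to your needs) could be:

<?xml version="1.0"?>
<sdf version="1.4">
  <world name="default">
    <include>
      <uri>model://ground_plane</uri>
    </include>
    <include>
      <uri>model://sun</uri>
    </include>
    <model name="camera_model">
      <static>true</static>
      <pose>0 0 0.5 0 0 0</pose>
      <link name="link">
        <visual name="visual">
          <geometry>
            <box><size>0.1 0.1 0.1</size></box>
          </geometry>
        </visual>
        <sensor name="camera" type="camera">
          <camera>
            <horizontal_fov>1.047</horizontal_fov>
            <image>
              <width>320</width>
              <height>240</height>
            </image>
            <clip>
              <near>0.1</near>
              <far>100</far>
            </clip>
          </camera>
          <always_on>1</always_on>
          <update_rate>10</update_rate>
          <!-- The sensor plugin built in this tutorial -->
          <plugin name="light_sensor_plugin" filename="libgazebo_light_sensor_plugin.so">
            <cameraName>camera</cameraName>
            <imageTopicName>rgb/image_raw</imageTopicName>
            <cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
            <frameName>camera_link</frameName>
            <updateRate>10.0</updateRate>
            <alwaysOn>true</alwaysOn>
          </plugin>
        </sensor>
      </link>
    </model>
  </world>
</sdf>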
Now, the final step: create a launch file that will bring everything up for you. Save the following code as main.launch inside the launch directory of your package.
<launch>
<!-- We resume the logic in empty_world.launch, changing only the name of the world to be launched -->
<include file="$(find gazebo_ros)/launch/empty_world.launch">
<arg name="verbose" value="true"/>
<arg name="world_name" value="$(find gazebo_light_sensor_plugin)/worlds/light.world"/>
<!-- more default parameters can be changed here -->
</include>
</launch>
Ready to run!
Now launch the world. Be sure that a roscore is running on your machine and that the GAZEBO_PLUGIN_PATH environment variable includes the path to the new plugin.
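For example, assuming your workspace is at ~/catkin_ws (so catkin_make placed the library in its devel/lib folder), you could run:

export GAZEBO_PLUGIN_PATH=$GAZEBO_PLUGIN_PATH:$HOME/catkin_ws/devel/lib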
Now execute the following command:
roslaunch gazebo_light_sensor_plugin main.launch
You can see what the camera is observing by running the following command:
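One way to do it is with the image_view tool, pointing it at the image topic produced by the plugin (here /camera/rgb/image_raw, which assumes the cameraName and imageTopicName values used in the world file sketch above; adjust it to your own configuration):

rosrun image_view image_view image:=/camera/rgb/image_raw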
After running that command, a small window will appear on your screen showing what the camera is capturing. Of course, since your world is almost empty, you will only see a rather plain, empty view.
Try to add some stuff in front of the camera and see how it is actually working.
Now it is time to check the value of illuminance by watching the published topic (/light_sensor_plugin/lightSensor). Just type the following and you are done:
rostopic echo /light_sensor_plugin/lightSensor
You should see the Illuminance messages being published on your screen.
Conclusion
Now you have a plugin for your Gazebo simulations that can measure (very roughly) the light detected. It can be improved in many ways, but it serves as a starting point for understanding the complex world of plugins within Gazebo.
You can use it in your desktop Gazebo or even inside the ROS Development Studio. It is also independent of the ROS version you are using (just install the proper packages).
Do you have any interesting modifications for this plugin? What about computing variance? Or what about computing illuminance by raytracing to lights? Please share your mods here!