ROS is becoming the standard in robotics, not only for robotics research, but also for robotics companies that build and sell robots. In this article, I will offer a list of the top 10 worldwide robotics companies that base their robotics products on ROS.
Criteria
This is the list of criteria I followed to select the winners:
We are talking about robotics companies that build robots here. This is not about companies that produce some kind of software based on ROS, but companies that create and ship robots based on ROS. We do not consider companies that do consulting and build solutions for third parties either.
They have created the robots themselves. This means they are not re-sellers or distributors of robots made by somebody else.
They have their robots natively running ROS. This means, you switch the robot on and it is running ROS. We are not taking into account robots that support ROS (if you install the packages). We concentrate on robots that run ROS off-the-shelf. For example, you can run ROS on a UR5 arm robot, but if you buy the UR5 robot, it will not come with ROS support off-the-shelf. You need to add an extra layer of work. We are not considering those robots.
You can program the robots. Even if some companies provide ROS based robots (like, for example, Locus Robotics) they do not provide a way to program them. They provide the robots as a closed solution. We are not considering here closed solutions.
Summarizing, the criteria are: one, you can buy the robot directly from the company; two, the robot runs ROS from minute one; and three, you can program the robot at will.
Once the companies were selected based on the previous criteria, I had to decide the order. Order was based on my personal perception of the impact those companies are making in the ROS world. That is very subjective to my own experience, I know, but that is what it is. Whenever I felt it necessary, I described my motivation behind the position of the company on the list.
Now, having clarified all that, let’s go to the list!
Clearpath is a Canadian company founded in 2009. They produce an amazing range of robots in the fields of Unmanned Ground Vehicles, Unmanned Surface Vehicles (water vehicles), and industrial vehicles. Their robots are based on ROS and can be programmed with ROS from minute one. That is why their robots are used in the creation of third-party applications for mining, surveying, inspection, agriculture, and material handling.
As a testament to their trustworthiness, this company took on the responsibility of providing customer support for the existing PR2 robots once Willow Garage closed its doors. Because of that, and because it is the company with the most varied range of ROS robots available, I put it in the well-deserved number 1 spot on the list.
Fetch Robotics was founded by Melonee Wise in 2014, after she was forced to close her previous pioneer company, Unbounded Robotics. We can say that Fetch has two lines of business: first, the line of mobile manipulators, which are mainly used for robotics research; then, a line of industrial robots where they sell fleets of robots ready to be deployed in a warehouse to help with the transport of materials. As I understand it, the first line of business is the only one that allows direct ROS programming, the second one being a closed product.
I did not select Fetch for number 2 because of its research line only. I selected it for the number 2 spot because Fetch was a pioneer in the creation of affordable mobile manipulators (with their Fetch robot). Up to the moment they released Fetch, there was no ROS based mobile manipulator on the market (sorry, Turtlebot 2 with a dynamixel arm doesn’t count as a mobile manipulator).
Recently, Fetch organized the FetchIt! challenge at ICRA 2019 (and our company, The Construct, was a partner contributing to the simulation of the event). At that event, participants had to program their Fetch to produce some pieces in a manufacturing room. You can check the results here.
Even if Fetch Robotics only produces two robots with the criteria above (the Fetch and Freight robots), they were the pioneers that opened the field of ROS based mobile manipulators. That is why they deserve the number 2 spot on this list.
Pal Robotics is a Spanish company based in Barcelona and created in 2004. I especially love this company because I worked there for more than 7 years and many of my friends are still there (I worked there from 2007 until 2014). But love is not the reason I put them in the third position. They have a well-deserved 3rd position because they are the only company in the world that builds and sells human-size humanoid robots. And not just a single type of humanoid, but three different types: the Reem robot, the Reem-C robot, and recently, the TALOS robot.
Pal also produces mobile manipulators similar to the Fetch ones. Their manipulators are called Tiago, and you can buy them for your research or applications on top (if you’re interested, you can learn how to program Tiago robots with ROS here, an online course that we created in collaboration with Pal Robotics).
We have recently released a simulation of their latest robot TALOS including the walking controllers. You can get it here.
Robotnik is another Spanish company, based in Castellon and founded in 2002. I call them the Spanish Clearpath. Really, they build as many ROS robots as the first company on this list. They create and design mobile manipulators, Unmanned Ground Vehicles of different types, and many types of mobile robots for industrial applications and logistics. They are also experts in customizing a ROS robot to your requirements, integrating third-party robotics parts into a final ROS-based robot that meets your needs. Finally, they are the people behind the ROS Components online shop, where you can buy components for your robots that are certified to support ROS off-the-shelf. For all this extensive activity in selling ROS robots, Robotnik deserves the fourth position on our list.
Yujin is a Korean company specializing in vacuum cleaning robots. However, their vacuum cleaning robots are not the reason they are on this list, since those do not run ROS onboard. Instead, they are here because they are the official sellers of the Kobuki robot (the base of the Turtlebot 2 robot). The Turtlebot 2 is the most famous ROS robot in the world, even more so than the PR2! Almost every one of us has learned with that robot, either in simulation or on the real robot. Due to its low cost, it allows you to easily enter the ROS world. If you have bought a Turtlebot 2 robot, it is very likely that the base was made by Yujin. We used the Kobuki as the base of our robot Barista, and I use several of them in my ROS class at La Salle University.
Additionally, Yujin has developed another ROS robot for logistics that is called GoCart, a very interesting robot for logistics inside buildings (not for warehouses), where the robot can be used to send packages from one location in the building to another (including elevators on the path).
This is another Korean company that is making it big in the ROS world. Even if Robotis is well known for its Dynamixel servos, they are best known in the ROS world for their Turtlebot 3 robot and Open Manipulator, both presented as the next generation of the Turtlebot series. With the development of the Turtlebot 3, they brought the Turtlebot concept to another level, allowing people easier entry into ROS. Their manipulator is also very well integrated with the Turtlebot 3, so you can have a complete mobile manipulator for a few hundred dollars. Even better, they have released all the designs of both robots as open source, so you can build the robots yourself.
Shadow Robot is a British company based in London. This company is a pioneer in the development of humanoid robotic hands. To my knowledge, they are the only company in the world that sells that kind of robotic hand. Furthermore, their hands are ROS programmable off-the-shelf. Apart from hands, they also produce many other types of grippers for robots, which can be mounted on robotic arms to create complete grasping solutions. One of their solutions combined with third party robots was the Smart Grasping System released in 2016, combining a three fingers gripper with a UR5 robot (here is a simulation we created of the Smart Grasping System, in collaboration with Ugo Cupcic).
Husarion is a Polish company founded in 2013. It sells simple and compact autonomous mobile robots called ROSbots. The ROSbots are small, four-wheeled robots equipped with a lidar, camera, and a point cloud device. They are the perfect robot for learning ROS with a real robot, or for doing research and learning with a more compact robot than the Turtlebot 2. They also produce the Panther robot, which is more oriented to outdoor environments, but with the same purpose of research and learning.
What makes Husarion special from other companies selling ROS robots is the compactness of their robots and the Husarnet network they have created, which connects the robots through the cloud and has remote control over them.
Neobotix is a manufacturer of mobile robots, and robot systems in general. They have robots and manipulators for a wide range of industrial applications, especially in the sector of material transport. Neobotix is a spin-off of the Fraunhofer Institute in Stuttgart, and they are the creators of the famous Care-O-Bot, many times used in the Robocup@Home competitions. However, as far as I know, the Care-O-Bot never reached the point of being a product (by which I mean something you can order 5 of and get delivered, running from minute one after unpacking).
At present they concentrate on selling mobile bases, which can be customized with robotic arms, turning the whole system into a custom mobile manipulator. They also sell the mobile bases and the manipulators separately. Examples of mobile bases are their MP series of robots. On the mobile manipulator side, they have the MM series. All of them work off-the-shelf with ROS.
Even if their products are full products on their own, I see them more as components that we can use for building more complex robots, allowing us to save time creating all the parts. That is why I have decided to put it in the 9th position and not above the other products.
Gaitech is a Chinese company that is mainly dedicated to distributing ROS robots, and ROS products in general, in China from third party companies (from many of the companies on this list, including Fetch, Pal, and Robotnik, among others). However, they also have their own line of development in which they develop their own robots; for example, the Gapter drone, the only drone I’m aware of that works with ROS off-the-shelf.
Even if their robots are not very popular in the ROS circuit, I have included them here because, at present, they are the only company in the world building ROS-based drones (Erle Robotics made ROS-based drones in the past, but as far as I know, they ceased that activity when they switched to Acutronic Robotics). Due to this lack of competition, I think they deserve the number 10 position.
The following is a shortlist of other companies building ROS robots that did not make it onto the list for certain reasons. They may be here next year!
1. Sony
Sony is a complete newcomer to the world of ROS robots, but it has entered through the big door. Last year (2018), they announced the release of the Aibo robot dog, which fully works on ROS. That was a big surprise to all of us, especially since Sony abandoned the Aibo project back in 2005.
Their robot could have put them onto the list above, except for the fact that the robot is still too new and can only be bought in the USA and Japan. Furthermore, the robot still has a very limited programming SDK (you can barely program it).
If you are interested in the inner workings of Aibo with ROS, have a look at the presentation by Tomoya Fujita, one of the engineers of the project, during the ROS Developers Conference 2019, where he explained about the communication mechanism between processes that they had to develop for ROS in order to reduce battery consumption in Aibo. Amazing stuff, fully compatible with ROS nodes and using the standard communication protocol!
This is a company based on selling simple ROS-based mobile bases for the development of third-party solutions, or as they call them, robot applications. Their goal is to provide a solid mobile base, with navigation running off-the-shelf, on top of which you can build other solutions like telepresence, robot waiters, and so on. It is a young company with a good idea in mind, but still too close to already existing solutions like Neobotix or Robotnik. Let's see next year how they have evolved.
They started building ROS-based drones, but recently they changed direction to produce ROS hardware microchips (H-ROS). They produce the MARA robot, an industrial arm running ROS 2 on H-ROS microchips. However, as far as I know, the MARA robot is not their main business, since they created it and sell it as an example of what can be done with H-ROS. That is why I decided not to include this company in the main top 10 list.
By the way, we also collaborated with Acutronic to create a series of videos about how to learn ROS2 using their MARA robot. Check them out here.
Most of the ROS based robotics companies are based on wheeled robots. A few exceptions are the humanoid robots of Pal Robotics, the drones of Gaitech, the robotic hands from Shadow Robots, and the arm robots from Neobotix.
It's very interesting that we see almost no drones and no robotic arms running ROS off-the-shelf, given that both are very common types of robots. I know there are many robotic arm companies that provide ROS drivers for their robots and many packages for their control, like Universal Robots or Kinova, to name a couple. However, on the list, only Neobotix actually provides an off-the-shelf arm robot with their MM series. What I think is that there is a lot of market space there for new ROS-based drones and robotic arms. Take note of that, entrepreneurs of the world!
Finally, I would like to conclude by saying that I do not know all the ROS companies out there. Even though I did my research to create this article, I may have missed some company worth mentioning. Let me know if you know of or have a company that sells ROS robots and should be on this list, so I can update it and correct any mistakes.
Photo at the top of the article: the Reem-C humanoid robot. (Photo: PAL Robotics)
Open the file in the editor and start coding. We start from the necessary libraries:
#! /usr/bin/env python
# import ros stuff
import rospy
# import ros message
from geometry_msgs.msg import Point
from sensor_msgs.msg import LaserScan
from nav_msgs.msg import Odometry
from tf import transformations
# import ros service
from std_srvs.srv import *
import math
Now, we define some global variables to store the goal and the current status of the robot and algorithm:
srv_client_go_to_point_ = None
srv_client_wall_follower_ = None
yaw_ = 0
yaw_error_allowed_ = 5 * (math.pi / 180) # 5 degrees
position_ = Point()
desired_position_ = Point()
desired_position_.x = rospy.get_param('des_pos_x')
desired_position_.y = rospy.get_param('des_pos_y')
desired_position_.z = 0
regions_ = None
state_desc_ = ['Go to point', 'circumnavigate obstacle', 'go to closest point']
state_ = 0
circumnavigate_starting_point_ = Point()
circumnavigate_closest_point_ = Point()
count_state_time_ = 0 # seconds the robot is in a state
count_loop_ = 0
# 0 - go to point
# 1 - circumnavigate
# 2 - go to closest point
Take a closer look. You'll notice there are some new variables (compared to the Bug 0 algorithm) to store not only the state, but also some necessary points (the starting point of the circumnavigation state and the closest point to the goal).
Step 2 – Defining callbacks
Second step: we define the callbacks; after all, we need to read the robot's position (given by odometry) and the obstacles (given by the laser readings).
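A minimal sketch of these callbacks, assuming the same odometry and laser setup used in the Bug 0 chapter (pose converted to yaw with tf, and a 720-ray scan summarized into five regions), could be:

def clbk_odom(msg):
    global position_, yaw_
    # store the current position of the robot
    position_ = msg.pose.pose.position
    # convert the quaternion orientation into a yaw angle
    quaternion = (
        msg.pose.pose.orientation.x,
        msg.pose.pose.orientation.y,
        msg.pose.pose.orientation.z,
        msg.pose.pose.orientation.w)
    yaw_ = transformations.euler_from_quaternion(quaternion)[2]

def clbk_laser(msg):
    global regions_
    # summarize the laser rays into five regions, keeping the closest reading (capped at 10 m)
    regions_ = {
        'right':  min(min(msg.ranges[0:143]), 10),
        'fright': min(min(msg.ranges[144:287]), 10),
        'front':  min(min(msg.ranges[288:431]), 10),
        'fleft':  min(min(msg.ranges[432:575]), 10),
        'left':   min(min(msg.ranges[576:719]), 10),
    }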
There is nothing new here, if you compare to the Bug 0 we have implemented before!
Step 3 – Helper functions
This part doesn't contain the rules of the robot's behavior, but it is important for helping the other functions work properly (and more clearly), since these helpers fetch values and assign states in a simpler way.
The first function is change_state; we had something similar in the previous algorithms:
def change_state(state):
    global state_, state_desc_
    global srv_client_wall_follower_, srv_client_go_to_point_
    global count_state_time_
    count_state_time_ = 0
    state_ = state
    log = "state changed: %s" % state_desc_[state]
    rospy.loginfo(log)
    if state_ == 0:
        resp = srv_client_go_to_point_(True)
        resp = srv_client_wall_follower_(False)
    if state_ == 1:
        resp = srv_client_go_to_point_(False)
        resp = srv_client_wall_follower_(True)
    if state_ == 2:
        # state 2 keeps the wall follower active: the robot retraces the obstacle
        # boundary until it reaches the closest point recorded during state 1
        resp = srv_client_go_to_point_(False)
        resp = srv_client_wall_follower_(True)
Then, we create the function calc_dist_points. It is used to compute the distance from the current position of the robot to points of interest (e.g., the entry point of the circumnavigation state, the goal, etc.). It is quite simple, but we don't want raw mathematical expressions scattered through our logic:
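A minimal sketch, assuming both arguments are geometry_msgs/Point messages, would be:

def calc_dist_points(point1, point2):
    # Euclidean distance between two geometry_msgs/Point
    dist = math.sqrt((point1.y - point2.y)**2 + (point1.x - point2.x)**2)
    return dist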
Let’s go to the main function, where we have it all working together.
Part 1 – Defining ROS node, callbacks, services and the initial state
def main():
    global regions_, position_, desired_position_, state_, yaw_, yaw_error_allowed_
    global srv_client_go_to_point_, srv_client_wall_follower_
    global circumnavigate_closest_point_, circumnavigate_starting_point_
    global count_loop_, count_state_time_

    rospy.init_node('bug1')

    sub_laser = rospy.Subscriber('/m2wr/laser/scan', LaserScan, clbk_laser)
    sub_odom = rospy.Subscriber('/odom', Odometry, clbk_odom)

    rospy.wait_for_service('/go_to_point_switch')
    rospy.wait_for_service('/wall_follower_switch')

    srv_client_go_to_point_ = rospy.ServiceProxy('/go_to_point_switch', SetBool)
    srv_client_wall_follower_ = rospy.ServiceProxy('/wall_follower_switch', SetBool)

    # initialize going to the point
    change_state(0)

    rate_hz = 20
    rate = rospy.Rate(rate_hz)
And finally, our loop control:
    while not rospy.is_shutdown():
        if regions_ is None:
            rate.sleep()  # no laser data yet, wait for the first callback
            continue

        if state_ == 0:
            if regions_['front'] > 0.15 and regions_['front'] < 1:
                circumnavigate_closest_point_ = position_
                circumnavigate_starting_point_ = position_
                change_state(1)

        elif state_ == 1:
            # if current position is closer to the goal than the previous closest_position, assign current position to closest_point
            if calc_dist_points(position_, desired_position_) < calc_dist_points(circumnavigate_closest_point_, desired_position_):
                circumnavigate_closest_point_ = position_

            # compare only after 5 seconds - need some time to get out of starting_point
            # if robot reaches (is close to) starting point
            if count_state_time_ > 5 and \
               calc_dist_points(position_, circumnavigate_starting_point_) < 0.2:
                change_state(2)

        elif state_ == 2:
            # if robot reaches (is close to) closest point
            if calc_dist_points(position_, circumnavigate_closest_point_) < 0.2:
                change_state(0)

        count_loop_ = count_loop_ + 1
        if count_loop_ == 20:
            count_state_time_ = count_state_time_ + 1
            count_loop_ = 0

        rate.sleep()

if __name__ == "__main__":
    main()
Depending on the state, we use one rule or another to do a transition.
For example, from state 0 – go to point, if there’s an obstacle ahead, we change to state 1 – circumnavigate obstacle
Step 5 – Launch it!
Let’s create a launch file and see it working!
Create a new file at ~/catkin_ws/src/motion_plan/launch/bug1.launch. The content is:
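A minimal sketch of what it might contain, assuming the goal parameters and the node scripts (go_to_point.py, follow_wall.py, bug1.py) from the previous chapters of this series, and a goal at the origin (adjust des_pos_x and des_pos_y to your own goal), would be:

<launch>
    <param name="des_pos_x" value="0" />
    <param name="des_pos_y" value="0" />
    <node pkg="motion_plan" type="go_to_point.py" name="go_to_point" />
    <node pkg="motion_plan" type="follow_wall.py" name="wall_follower" />
    <node pkg="motion_plan" type="bug1.py" name="bug1" output="screen" />
</launch>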
1 – You must launch the simulation like we did in previous chapters. Go to the simulation launcher and choose the world configured in our package:
2 – Spawn the robot
roslaunch m2wr_description spawn.launch y:=8
3 – Launch the algorithm
roslaunch motion_plan bug1.launch
Step 6 – Conclusion
Remember! If in any of the steps above you have missed something, you can always clone our ROSJect! Use this link to get your copy: http://www.rosject.io/l/b3a5b3c/
If you like it or have a suggestion to improve it, leave a comment!
ROS (Robot Operating System) is now very popular among roboticists. Researchers, hobbyists, and even robotics companies are using it, promoting it, and supporting it. However, it was not always like that. Do you know the history of ROS? In the early days, ROS was an unknown framework used only by a handful of robotics enthusiasts. How did ROS reach its current status as the robotics standard? Let's see in this article how ROS got there.
The Stanford Period
ROS started as a personal project of Keenan Wyrobek and Eric Berger while at Stanford, as an attempt to solve the reinventing-the-wheel problem from which robotics was suffering. These two were worried about the most common problems in robotics at the time:
too much time dedicated to re-implementing the software infrastructure required to build complex robotics algorithms (basically, drivers for the sensors and actuators, and communications between the different programs inside the same robot)
too little time dedicated to actually building intelligent robotics programs on top of that infrastructure.
Even inside the same organization, the drivers and communication systems were re-implemented for each new project. This situation was beautifully expressed by Keenan and Eric in one of the slides they used to pitch investors.
Most of the time spent in robotics was reinventing the wheel (slide from Eric and Keenan pitch deck)
In order to attack that problem, Eric and Keenan created a program at Stanford in 2006 called the Stanford Personal Robotics Program, with the aim of building a framework that allowed processes to communicate with each other, plus some tools to help create code on top of it. That framework was to be used to write the code for a robot they would also build, the Personal Robot, as a testbed and example for others. They would build 10 of those robots and provide them to universities so that they could develop software based on the framework.
NOTE: People more versed in ROS will recognize in those the precursors of the ros_comm libraries and of tools like Rviz and rqt in modern ROS distributions. Also, the Personal Robot was the precursor of the famous PR2 robot.
PR1 robot, picture by IEEE Spectrum
Similar frameworks at the time
The idea of such a system for robotics was not new. Actually, there were some other related projects already available to the robotics community: Player/Stage, one of the most famous on the open source side, and URBI on the proprietary side. Even Open-R, the system developed by Sony that powered the early Aibo robots of 1999, was created to address that problem (a shame that Sony canceled that project, as they could have been the leaders by now; ironically, this year Sony launched a new version of the Aibo robot… which runs ROS inside!). Finally, another similar system developed in Europe was YARP.
Actually, one of the leaders of the Player/Stage research project was Brian Gerkey, who later went to Willow Garage to develop ROS and is now the CEO of Open Robotics, the company currently behind the development of ROS. For its part, URBI was a professional system led by Jean-Christophe Baillie; it worked very well, but could not compete with the free-ness of ROS.
The fastest readers will jump to the point that URBI was not free. Actually, it was quite expensive. Was the price what killed URBI? I don't think so. In my opinion, what killed URBI was the lack of community. It takes some time to build a community, but once you have it, it acts like a gear change. URBI could not build a community because it relied on an (expensive) license fee. That meant only people who could afford it had access to the framework, which limits the community you can create. It is true that ROS was free, but that is not the reason it won (many free products fail). The reason is that they built a community; being free was just a strategy to build that community.
Switching gears
While at Stanford, Keenan and Eric received $50k in funding and used it to build the PR1 robot and a demo of what their project was about. However, they realized that in order to build a truly universal system and provide those robots to research groups, they would need additional funding. So they started to pitch investors.
At some point around 2008, Keenan and Eric met with Scott Hassan, investor and the founder of Willow Garage, a research center with a focus on robotics products. Scott found their idea so interesting that he decided to fund it and start a Personal Robotics Program inside Willow Garage with them. The Robot Operating System was born and the PR2 robot with it. Actually, the ROS project became so important that all the other projects of Willow Garage were discarded and Willow Garage concentrated only on the development and spread of ROS.
Willow Garage takes the lead
ROS was developed at Willow Garage for around 6 years, until Willow shut down back in 2014. During that time, many advancements were made in the project. It was this push during the Willow years that skyrocketed its popularity. It was also during that time that I became aware of its existence (I started with ROS C Turtle in 2010) and decided to switch from Player/Stage (the framework I was using at the time) to ROS, even though I was in love with Player/Stage (I don't miss you, because ROS is so much better in all aspects… sorry, Player/Stage, it is not me, it is you).
In 2009, the first distribution of ROS was released: ROS Mango Tango, also called ROS 0.4. As you can see, the name of the first release had nothing to do with the current naming convention (for reasons unknown to this author). Release 1.0 of that distribution was launched almost a year later, in 2010. From that point on, the ROS team decided to name the distributions after turtle species.
In 2009, they built a second version of the Personal Robot, the PR2
In xxx, they launched ROS Answers, the channel to answer technical questions about ROS.
The first edition of the ROSCON was in 2012. The ROSCON became the official yearly conference for ROS developers.
In 2010, they built 11 PR2 robots and provided them to 11 universities for robotics software development using ROS (as the original idea of Eric and Keenan). At that point, the PR2 robot was for sale, so anybody in the world could buy one (if enough money was available ;-)).
Simulation started to become very important. More precisely, 3D simulation. That is why the team decided to incorporate Gazebo, the 3D robotics simulator from the Player/Stage project, into ROS. Gazebo became the default 3D simulator for ROS.
As ROS was evolving, all the metrics of ROS were skyrocketing. The number of repositories, the number of packages provided, and of course, the number of universities using it and of companies putting it into their products.
Evolution of ROS in the early days (picture from Willow Garage)
Another important event that increased the size of the ROS community was that, in 2011, Willow Garage announced the release of the Turtlebot robot, the most famous robot among ROS developers. Even if the PR2 was the intended robot for testing and developing with ROS, its complexity and high price made it non-viable for most researchers. Instead, the Turtlebot was a simple and cheap robot that allowed anybody to experiment with the basics of robotics and ROS. It quickly became a big hit, and it is used even today, in its Turtlebot 2 and Turtlebot 3 versions.
I remember when we received the news that Willow Garage was closing. I was working at Pal Robotics at the time, and we were all very worried. What would happen to ROS? After all, we had changed a lot of our code to work with ROS. We had removed previous libraries like Karto for navigation (Karto is software for robot navigation, which is free at present, but at that time we had to pay for a license to use it as the main SLAM and path-planning component of our robots).
The idea was that the newly created Open Source Robotics Foundation would take the lead in ROS development. Many of the employees were absorbed by Suitable Technologies (one of the spin-offs created from Willow Garage, which, ironically, does not use ROS in its products ;-)). Customer support for all the PR2 robots was taken over by another important company, Clearpath Robotics.
Under the Open Source Robotics Foundation umbrella
Under the new legal structure of the OSRF, ROS continued to develop and release new distributions.
The reports created after each year are publicly available here under the tag ROS Metrics.
Having reached this point, it is important to note that the last distribution of ROS 1 will be released next year, in 2020. It is called ROS Noetic, and it will be based on Python 3 instead of Python 2, as all the previous ones were. From that date on, no more ROS 1 distributions will be released, and development will focus fully on ROS 2.
What is ROS 2? Let’s dig in…
ROS 2.0
Around 2015, the deficiencies of ROS for commercial products were becoming very clear. A single point of failure (the roscore), lack of security, and no real-time support were some of the main deficiencies that companies cited for not supporting ROS in their products. However, it was clear that, if ROS was to become the standard for robotics, it had to reach the industrial sector with a stronger voice than that of the few pioneer companies already shipping ROS in their products.
In order to overcome those limitations, the OSRF undertook the effort of creating ROS 2.0.
ROS 2.0 has already reached its fourth distribution in June this year with the release of Dashing Diademata.
Recent movements in the ROS ecosystem
In 2017, the Open Source Robotics Foundation changed its name to Open Robotics, in order to become more of a company than a foundation, even if the foundation branch still exists (some explanation about it can be found in this post and in this interview of Tully Foote).
Recently, Open Robotics has opened a new facility in Singapore and established a collaboration with the government there for development.
Local ROS conferences have been launched:
ROSCON France
ROSCON Japan
In the last months, big players like Amazon, Google, and Microsoft have started to show interest in the system, and show support for ROS.
That is definitely a sign that ROS is in good health (I would say in better health than ever) and that it has a bright future in front of it. Sure, many problems will arise (like, for example, the current problem of creating a last ROS 1 distribution based on Python 3), but I’m 100% sure that we, and by we I mean the whole ROS community, will solve them and build on top of them.
ROS is a mature, open source project. I would say that ROS is the most mature, open source robotics project in the world. So if you are planning to get into robotics, you need to master it.
In the next article, we are going to move onto the problem of learning ROS. ROS is complex. ROS takes time. So, how do you approach that learning to optimize your time? That is what I’m going to show you in the next article (if you cannot wait until then, let me point you towards the ROS online academy that we have built to speed up your learning).
THIS CONTENT IS OUTDATED. PLEASE DO NOT FOLLOW IT THROUGH. WE ARE UPGRADING THIS CONTENT FOR THE NEW VERSION OF ROSDS IN THE NEXT WEEKS. STAY TUNED.
Intro
First of all, we want to explain what "Establish a real robot connection" means.
It means that we connect the ROS Development Studio (ROSDS) remote server that you launch each time you open a ROSject with another device.
This device can be another remote server, a robot's computer, or your own computer.
The only limitation for the moment is that these devices have to run an Ubuntu distribution to guarantee that it will work.
Establishing a connection means being able to ping that device and being able to see, from both devices, all the topics, services, and ROS-related elements, no matter whether they are published in ROSDS or in the other device.
With that explained, let's see the steps you have to follow to get this connection up and running.
How to start the Real Robot Connection
To do it you have to follow these three steps:
Install the rosds_real_robot_connection package on the device you want to connect to ROSDS. Here you have the Git repository.
Turn on the Real Robot Connection from the robot's side.
Establish the connection from the ROSDS side.
1. Install rosds_real_robot_connection
The first thing you need is the IP_DEVICE and the user_name_in_device of your device. To get them, you can use two methods:
Connect physically to your Device and execute the following command in a terminal:
ifconfig –> Extract the IP_DEVICE from inet addr:XXX.XXX.X.XXX
whoami –> It will give you the user_name_in_device.
Use an IP scanning app on your phone, like Fing, and search for the name of your device in the list; there you will find its IP. This is useful when you don't have direct access to the device or it doesn't have a screen connected.
Now that you have the IP_DEVICE and the username, access the device (the robot's computer, your computer, or another remote server). You can do this through several methods.
Through SSH: ssh user_name_in_device@IP_DEVICE
Through Remote Desktop: we recommend using Remmina. Just set up the user, password, and IP.
Execute the following commands to install everything:
cd rosds_real_robot_connection
sudo ./realrobot_setup.sh
sudo reboot
And that's it. If all went well, you will now have the real robot connection server ready on the device. To test that it's operational, just execute this command once the first reboot has finished:
systemctl status rosds_connector.service
You should get a message similar to this:
ubuntu@ip-172-31-36-185:~$ systemctl status rosds_connector.service
● rosds_connector.service – ROSDS Connector
Loaded: loaded (/etc/systemd/system/rosds_connector.service; enabled; vendor preset: enabled)
Active: active (running) since Wed 2019-09-11 09:36:39 UTC; 24s ago
Main PID: 837 (start.sh)
Tasks: 8 (limit: 4478)
CGroup: /system.slice/rosds_connector.service
├─ 837 /bin/bash /usr/share/rosds_connector/start.sh ubuntu
└─1599 node /usr/share/rosds_connector/bin/www
Sep 11 09:36:39 ip-172-31-36-185 systemd[1]: Started ROSDS Connector.
Sep 11 09:36:43 ip-172-31-36-185 nodejs-example[837]: Now using node v10.15.3 (npm v6.4.1)
Turn on the Real Robot Connection from the robot's side
Now you have to follow these simple steps:
Open a web browser. We recommend Google Chrome.
Type in the URL: IP_DEVICE:3000
Here is an example of what you should get if the IP_DEVICE=192.168.1.170 and the user_name_in_device=panandtilt
Now you have to click on Turn ON. This will generate the Robot URL that you need to make the connection in ROSDS.
To TURN OFF the connection from the device side, just click on the TURN OFF button. This will sever the link and ROSDS won’t be able to connect anymore until you turn it ON again and update the connection with the new Robot URL generated.
Establish the connection from ROSDS side
For this last step, you need from the previous step:
Robot URL
Device Name
Follow these steps:
You have to click the RealRobot tab, Connect to Robot ON, and after a few minutes, you will be greeted with the configuration window.
Place the Robot URL and the Device Name in the corresponding form inputs.
Click on CONNECT.
After around 5-30 seconds the connection will have been established.
Now the CONNECTION is ESTABLISHED. By default, the new device is the ROS_MASTER. If you need to change it to ROSDS computer just select it.
Test that everything is working
If you want to be sure that everything is working, here are some simple tests you can perform to verify that your Real Robot Connection is working perfectly:
1. Basic RealRobot connection Testing
Here we test only the raw connection, ROS is not tested here:
You should be able to ping the device from rosdscomputer and ping rosdscomputer from the device.
Inside the Device: ping6 rosdscomputer
Output: PING rosdscomputer(rosdscomputer) 56 data bytes 64 bytes from rosdscomputer: icmp_seq=1 ttl=3 time=49.0 ms
Inside ROSDS web shell: ping6 devicename
Output: PING panandtilt(panandtilt) 56 data bytes 64 bytes from panandtilt: icmp_seq=1 ttl=3 time=48.7 ms
You should be able to SSH into the device from ROSDS (if you have the device's password, of course):
Inside ROSDS: ssh devicename@IP_DEVICE
Basic ROS Test
Now we can test if ROS is working:
Remember that you have to decide who is the ROS_MASTER, and therefore where you will have to launch the ROSCORE.
Inside the Device: rostopic pub /device_test std_msgs/String "data: 'I am The Device'" -r1
Output: ERROR: Unable to communicate with master!.
Of course, you have to launch the roscore first inside the device, if that's the one you set up as ROS_MASTER!
Inside the Device: roscore
Output2: Nothing. That means that the rostopic publish is working.
Inside ROSDS web shell: rostopic echo /device_test
You should see the message: I am The Device
And now let’s test the other way round:
Inside ROSDS web shell: rostopic pub /rosds_test std_msgs/String "data: 'I am ROSDS'" -r1
Inside the Device: rostopic echo /rosds_test
Connect through ssh from ROSDS:
You can also connect through ROSDS to your connected robot. You just have to:
Install the following on the robot's computer:
sudo apt-get install openssh-server
sudo netstat -lntp | grep 22
This will install the SSH server and check that it is running.
To connect through ROSDS you need the name of the user on the robot's PC, for example "finn", and the hostname given in the real robot connection config, for example "jacke". Then the command you need to write in the web shell is:
ssh finn@jacke
This will ask you for that user's password. And that's it!
Real Life Example
There is a whole video series that explains how to use this to move a pan-and-tilt unit for face recognition using a Raspberry Pi, which we connect through ROSDS at the end.
In this post, we are checking the failure of the Bug 0 algorithm! What is this about?
Bug 0 is not a perfect motion planning algorithm! And at the end of this post, we show how it could be improved using Bug 1. Let’s start!
(If you don’t have the previous post ROSJect, you can copy it from here!)
Step 1 – Check Bug 0 Failure
The first thing is launching a simulation with a new world we have created for this post. It's a large file and it doesn't make sense, for this post, to explain its code. Please copy the file from here! Paste its content into a new file: ~/simulation_ws/src/my_worlds/worlds/world03.world
Then change the launch file at ~/simulation_ws/src/my_worlds/launch/world.launch: set the value of the argument world to "world03".
Launch the world as before: from the ROSDS simulation menu, choose my_worlds world.launch from the dropdown. You should now have the simulation below running.
Next step: Spawn the robot. Let’s use the very same command, just pay attention to the arguments:
roslaunch m2wr_description spawn.launch y:=8
The robot must be at the position shown below, more or less:
Again, pay attention to the arguments, we are sending the robot to the center of the maze.
Step 2 – What is happening to Bug 0?
If everything went as expected, you should have seen something like this:
It turns out Bug 0 can't handle this kind of "maze" (it's not a challenging maze if we know the map, but for an algorithm without a map, it is!). Bug 0 always turns to the same side (left or right); that's why it never reaches the goal.
How can we overcome this?
Step 3 – Introducing Bug 1
There is another motion planning algorithm, this one called Bug 1. Its behavior is described as follows:
The robot tries to go straight to the point;
If there is an obstacle, the robot circumnavigates it completely, recording the point on its path that is closest to the goal;
When the robot gets back to the point where it first met the obstacle, it goes to the closest point noted previously;
From the closest point, the robot tries again going in a straight line to the goal.
And it works in a loop. Check below how Bug 1 would perform the task in the same environment:
Go straight to the point (blue line);
Find an obstacle;
Circumnavigates the obstacle until it reaches the initial/entry point (light blue line);
It’s back to the original point;
Go to the closest point found during circumnavigation (purple line);
Reaches the closest point;
Go to the goal in a straight line (blue line again).
Conclusion
We saw how Bug 0 can fail, and also a better algorithm to solve the problem. For the next post, what if you try to implement the Bug 1 algorithm using some of the functions we have created so far?
If something went wrong up to here, you can copy the ROSJect here!
See you!
Python is a good choice if you want to become a robotics developer (i.e., program robots), especially if you want to program your robots with ROS. In a previous post, I discussed whether to learn ROS using C++ or Python. In that post, I argued that if you know neither Python nor C++, nor ROS, then your best bet to get into robotics faster is to learn robotics using Python.
However, as a teacher of ROS at the Master of Robotics and Automation of La Salle University in Barcelona, I find the same situation every year: the students who arrive to learn ROS have no knowledge of Python or C++. If I have to teach them ROS, I first have to teach them one of those languages.
If I convinced you to learn ROS using Python, then your next step must be to learn Python. Let’s dig in.
Python 2.7 or Python 3?
Ok, so you are fully into learning Python. Then you will hear that, at present, there are two versions of Python available and that they cannot be mixed. That means that programs written for Python 2.7 are not compatible with Python 3, and vice versa. So… which one?
In order to make a decision about this, I think that we have to concentrate on the goal we are trying to achieve. You are here because you want to become a robotics developer. Furthermore, you want to become a robotics developer based on ROS (here are some reasons why you should). So, let’s ask ROS what it thinks.
Which Python version uses ROS?
All ROS distributions released up to today (ROS Melodic) have been based on Python 2.7. This means that all the code already available for those distributions needs to run on Python 2.7.
However, the next distribution of ROS 1 (ROS Noetic, to be released in 2020) is going to be fully based on Python 3 (that is going to be a lot of fun!), as indicated by main ROS developer Tully Foote in this interview.
Additionally, the next generation of ROS, that is, ROS 2, works only on Python 3, so I think that the answer is clear from the point of view of ROS: learn Python 3.
Which Python version for other important libraries?
Apart from ROS, you will have to use many other libraries to build your robot programs.
For example, if you want to do image manipulation, then you will probably use OpenCV. If you want to use machine learning with the robot, you will probably use Scikit, OpenAI-Baselines, or Tensorflow/Keras. If you want to use numerical computation, you may use scipy, pandas, or numpy. Except for OpenAI-Baselines, all of them work with both Python versions.
Check this page for a complete list of the most important Python libraries ready for Python 3.
Additional considerations
On top of all that, Python 2.7 is going to reach the end of its life in 2020. That doesn't mean that your Python 2.7 programs will stop working on that date! It just means that no additional development will be done by the organization that maintains Python (that is, the Python Software Foundation). So, all the existing Python software will continue to work, and you will still be able to develop with it, but there will be no new bug fixes.
Finally, let me clarify that, from a beginner's point of view, the differences between Python 2.7 and Python 3 are very small. My advice: if you are a beginner, it is a safe bet to start with Python 3 and then learn the small differences that affect your case as they arise.
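As a quick illustration, two of the most visible differences a beginner will run into are the print syntax and integer division:

# Python 2.7
print "distance to obstacle:", 3 / 2      # prints 1 (integer division)

# Python 3
print("distance to obstacle:", 3 / 2)     # prints 1.5 (true division)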
Python for Robotics Course
What to learn of Python
If you want to start quick with robotics and ROS, I would say that you only need to understand the following basic concepts of Python:
1. How to create and execute a Python program
This is the basic step that will allow you to run your Python programs and also the programs of third parties.
Included here are:
How to create a Python file
How to include (import) external libraries into your Python programs (and how to use them)
What is the role of indentation in Python
How to provide execution permissions to the Python files
An example of code that must be understood after this step is the following code:
#!/usr/bin/env python

import rospy

if __name__ == "__main__":
    rospy.init_node("my_node")
    print("Keep pushing your ROS learning!")
You must understand how to create that program, how to provide permissions for execution, and how to actually execute it.
2. Basic Python language:
In this step, you must understand the types of variables (including lists and dictionaries), control loops, conditionals, and the operators that Python provides.
This is the typical basic knowledge that you must learn for any programming language. Not a lot to say here, except for the small robotics-flavored example below…
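For instance, a list of laser readings, a loop, and a conditional could be combined like this (the numbers here are made up for illustration):

# hypothetical list of laser range readings, in meters
laser_ranges = [2.5, 1.8, 0.4, 3.0, 0.9]

obstacle_threshold = 0.5  # meters

# loop over the readings and check each one against the threshold
for i, distance in enumerate(laser_ranges):
    if distance < obstacle_threshold:
        print("Obstacle detected by ray %d at %.2f m" % (i, distance))
    else:
        print("Ray %d is clear (%.2f m)" % (i, distance))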
3. Functions
A function is a block of organized, reusable code that is used to perform a single, related action. Functions provide better modularity for your application and a high degree of code reusing. Python provides many built-in functions, like print(), etc., but you can also create your own.
The important points to understand about functions are:
How to create a function: definition
How to call a function
How to provide arguments
How to return values
After this step, you need to understand the following simple example of a function that receives a parameter and returns a value when called from the main program:
#!/usr/bin/env python

import rospy

def print_message(name):
    sentence = name + " keep pushing your ROS learning... "
    return sentence

if __name__ == "__main__":
    rospy.init_node("my_node")
    full_sentence = print_message("Rick")
    print(full_sentence)
4. Classes and objects
A Python class is simply a collection of data (variables) and methods (functions) that act on that data; an object is an instance of a class. Classes are the basis of good robotics programming, and you must master them if you want to create complex robotics programs that do not turn into spaghetti. The main concepts about classes that you must master are:
How to create (define) a class. Understand what the members of the class are and what the constructor is. Pay special attention to the word self, which indicates that a variable/function is a member of the class
How to instantiate a class
How to call the members of a class from an external program
After this section, you should be able to understand the following code, which makes a robot named Pepito move forward:
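A minimal sketch of such a program, assuming a differential-drive robot commanded through the standard /cmd_vel topic (the class name, topic, and speed here are illustrative, not necessarily the ones used in the course), could look like this:

#!/usr/bin/env python

import rospy
from geometry_msgs.msg import Twist

class Robot:
    def __init__(self, name):
        # members: the robot name and a publisher for velocity commands
        self.name = name
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

    def move_forward(self, speed=0.3):
        # build and publish a velocity command that moves the robot straight ahead
        cmd = Twist()
        cmd.linear.x = speed
        self.cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("pepito_controller")
    pepito = Robot("Pepito")
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        pepito.move_forward()
        rate.sleep()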
Once you know the aforementioned concepts, you are ready to move on to learning ROS. Of course, there is a lot more about Python than what I have indicated above, but you don’t need to master the language to start using it to learn ROS. Remember, the goal here is to learn and apply ROS to the control of robots.
Learn Python for robotics
In the previous section, I indicated that you must learn Python if you want to become a robotics developer. However, what I really recommend is that you learn Python while applying it to robot control.
I especially do not like learning about programming with simple examples of lists of names, conditionals based on stupid numbers, and so on. I do prefer to learn those programming concepts with a robotics application in mind. For example, I would like to know that a list allows me to include the full list of ranges obtained from the laser, which indicates distances to the closest obstacles. Or conditionals based on how close the obstacles are to the robot. Or loops for controlling the speed of the wheels on the robot.
What I mean is that the learning of Python concepts has to be done while applying it to a robotics situation. This method makes it possible to associate robotics concepts with the Python language. You become a robotics developer at the same time that you learn the language.
With this purpose in mind, we have created a specific course in which we teach Python while applying your learning to the control of robots. This is a completely free course where we teach all the Python 3 concepts discussed above while you apply it to simulated robots.
The course, entitled Python 3 for Robotics, is available and has the following chapters:
Some interesting questions before starting the course
Do I have to know about robotics?
No. You will be introduced to the required (simple) concepts of robotics while learning Python.
Do I have to know about ROS?
No. Even if you are going to control (simulated) robots throughout the course, you are not going to use ROS for it. The robots that you will use run ROS underneath, but it will be hidden from you. You will get data from the sensors and send commands to the wheels using pure Python classes provided by us.
Is Python the best language for robotics?
This is a very common question, and the answer is that it depends on what you want to do, which type of robotics application you want to build, and which type of robot you want to use. In general, I would say that if you are doing robotics research, then Python is your best bet. If you are working for a company, then maybe C++ is better suited for you, even if using Python in that situation can help you quickly build prototypes and test your ideas.