How to Install a USB Camera in TurtleBot3 – ROS Q&A #220

What we are going to learn

  1. How to install the ROS driver cv_camera
  2. How to add Camera link to TurtleBot3 transform frame tree

List of resources used in this post

  1. ROS Development Studio (ROSDS) —▸ http://rosds.online
  2. This post answers the following question: https://answers.ros.org/question/329102/which-camera-for-turtle3-burger-and-how-to-plug-it/
  3. cv_camera package: https://wiki.ros.org/cv_camera
  4. TurtleBot3 e-manual: https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/
  5. Turtlebot3 repository: https://github.com/ROBOTIS-GIT/turtlebot3
  6. ROS2 Tutorials –▸
    1. ROS2 Basics in 5 Days (C++): https://app.theconstructsim.com/#/Course/61
    2. ROS2 Basics in 5 Days (Python): ROS2 Basics for Python

Installing ROS OpenCV camera driver

In order to install the cv_camera driver, the first thing you have to do is to connect to your real robot.

If you are reading this tutorial, then we assume you already know how to connect to your own robot using SSH.

After you have connected, you can install the cv_camera package either by directly compiling the https://github.com/OTL/cv_camera package or by using apt-get, which is what we are going to do.

Important things about the cv_camera package:

  • It publishes the images on the ~image_raw topic (sensor_msgs/Image)
  • It uses camera as the default value of the ~frame_id parameter.

OK, let’s finally install the cv_camera package. The commands are the following:

sudo apt update 

sudo apt install ros-[YOUR-ROS-DISTRO]-cv-camera

Remember to replace [YOUR-ROS-DISTRO] with your ROS Distro, like melodic, or noetic, for example.

Once you have the package installed on your TurtleBot3, you can plug the camera into a USB port and test it by running this node:

source /opt/ros/[YOUR-ROS-DISTRO]/setup.bash

rosrun cv_camera cv_camera_node

 

If it works, then create a launch file that starts the node and also a static transform publisher from base_link to the camera frame (camera, the default frame_id), indicating the approximate offset to where you end up placing your camera. The static_transform_publisher arguments are x y z yaw pitch roll parent_frame child_frame period_in_ms. The file would look something like this:

<launch>
  <node pkg="cv_camera" type="cv_camera_node" name="cv_camera" output="screen"/>
  <node pkg="tf" type="static_transform_publisher" name="camera_frames_pub" args="0.05 0.0 0.1 0 0 0 /base_link /camera 35"/>
</launch>

After launching the launch file we just created, using roslaunch your_package_name_here your_launch_file.launch, you should be able to list the topics by using rostopic list. Among other topics, the output of rostopic list should show the following topic:

/cv_camera/image_raw

 

If you now open RViz and add an Image display subscribed to /cv_camera/image_raw, you should be able to see the camera feed.
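If you prefer to check the camera from code instead of RViz, a minimal ROS1 subscriber like the sketch below can confirm that frames are arriving. It is just a sketch assuming the default /cv_camera/image_raw topic and that cv_bridge is installed; the node name is made up for the example.

#!/usr/bin/env python
# Minimal sketch: print the size of every frame received from cv_camera.
# Assumes the default topic /cv_camera/image_raw and that cv_bridge is installed.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def on_image(msg):
    # Convert the ROS Image message to an OpenCV BGR image
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    rospy.loginfo("Got frame %dx%d (frame_id: %s)",
                  frame.shape[1], frame.shape[0], msg.header.frame_id)

if __name__ == '__main__':
    rospy.init_node('cv_camera_checker')  # hypothetical node name
    rospy.Subscriber('/cv_camera/image_raw', Image, on_image)
    rospy.spin()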

If you want a step-by-step walk-through of the instructions above, you can check out this video below.

Youtube video

So this is the post for today. Remember that we have the live version of this post on YouTube. If you liked the content, please consider subscribing to our YouTube channel. We are publishing new content ~every day.

Keep pushing your ROS Learning.


Spawning multiple robots in Gazebo with ROS2

What we are going to learn

  1. How to spawn N robots in a ROS2 Foxy Gazebo simulation, each with its own namespace, TFs, IMU data, and cmd_vel.

List of resources used in this post

  1. Use the rosject: https://app.theconstructsim.com/#/l/3da0aea5/
  2. ROS Development Studio (ROSDS) —▸ http://rosds.online
  3. ROS2 Tutorials –▸
    1. ROS2 Basics in 5 Days (C++): https://app.theconstructsim.com/#/Course/61
    2. ROS2 Basics in 5 Days (Python): ROS2 Basics for Python
  4. Box Bot repository: https://bitbucket.org/theconstructcore/box_bot/src/foxy/
  5. The question that this post answers: https://answers.ros.org/question/372730/spawning-multiple-robots-in-gazebo-with-ros2-with-namespaces/

Opening the rosject

In order to learn how to spawn many robots in ROS2, we need to have ROS installed. We already prepared a rosject ready for that: https://app.theconstructsim.com/#/l/47f74f81/. You can download the rosject on your own computer if you want to work locally, but just by copying the rosject (clicking the link above), you will have a setup already prepared for you.

After the rosject has been successfully copied to your own area, you should see a Run button. Just click that button to launch the rosject.

RUN rosject – Multiple Box Bot in ROS2

Setting up our environment to use purely ROS2 Foxy

On The Construct, when you open the rosject, the system will have ROS1 Noetic and ROS2 Foxy installed.

Let’s start by making our system able to compile purely with ROS2. For that, let’s first open a terminal:

Open a new Terminal

After opening the terminal, let’s now override the default ~/.bash_aliases to remove some ROS1-related settings:

sudo cp /home/user/ros2_ws/src/.bash_aliases /home/user/

 

After having replaced the ~/.bash_aliases file, please close the terminal window and open a totally new one, in order to make sure the ROS2 setup is loaded.

After that, you should be able to type the following command:

whoistheboss

 

Compiling our ~/ros2_ws and launching the simulation

Before launching the simulation, let’s first compile our box_bot_description package with ROS2. Let’s open a terminal and run the following commands:

cd
cd ros2_ws/
rm -rf build install log
source /opt/ros/foxy/setup.bash
colcon build --symlink-install --packages-select box_bot_gazebo box_bot_description

 

Now that our box_bot_description package is compiled, let’s run it with the following commands:

source install/setup.bash
ros2 launch box_bot_gazebo multi_box_bot_launch.py

 

Opening Gazebo to see the robots spawned

Now that the robots have been spawned, you can click the Open Gazebo button to see the 10 robots that were spawned:

Click Open Gazebo to see the 10 robots spawned

 

Making sure the robots are each under their own namespace

To make sure each robot is under its own namespace, let's list the topics in another terminal by running ros2 topic list.

The output should be similar to the following:

/box_bot0/cmd_vel
/box_bot0/data
/box_bot0/odom
/box_bot0/tf
/box_bot1/cmd_vel
/box_bot1/data
/box_bot1/odom
/box_bot1/tf
/box_bot2/cmd_vel
/box_bot2/data
/box_bot2/odom
/box_bot2/tf
/box_bot3/cmd_vel
/box_bot3/data
/box_bot3/odom
/box_bot3/tf
/box_bot4/cmd_vel
/box_bot4/data
/box_bot4/odom
/box_bot4/tf
/box_bot5/cmd_vel
/box_bot5/data
/box_bot5/odom
/box_bot5/tf
/box_bot6/cmd_vel
/box_bot6/data
/box_bot6/odom
/box_bot6/tf
/box_bot7/cmd_vel
/box_bot7/data
/box_bot7/odom
/box_bot7/tf
/box_bot8/cmd_vel
/box_bot8/data
/box_bot8/odom
/box_bot8/tf
/box_bot9/cmd_vel
/box_bot9/data
/box_bot9/odom
/box_bot9/tf
/clock
/parameter_events
/performance_metrics
/rosout

Here you can see that we really have each robot under a different namespace.

Moving robots independently

We can see that we have namespaces going from box_bot0 to box_bot9.

In order to move a specific robot, let's say the robot under the namespace box_bot9, we can do it by running the following command:

ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args --remap cmd_vel:=/box_bot9/cmd_vel

In this command, we are basically running the teleop_twist_keyboard node and remapping the cmd_vel topic to /box_bot9/cmd_vel. This way, we are moving the 10th robot. You can control a different robot just by replacing box_bot9 with box_bot3 or box_bot7, for example.

If everything went well, you should see the following:

This node takes keypresses from the keyboard and publishes them
as Twist messages. It works best with a US keyboard layout.
---------------------------
Moving around:
   u    i    o
   j    k    l
   m    ,    .

For Holonomic mode (strafing), hold down the shift key:
---------------------------
   U    I    O
   J    K    L
   M    <    >

t : up (+z)
b : down (-z)

anything else : stop

q/z : increase/decrease max speeds by 10%
w/x : increase/decrease only linear speed by 10%
e/c : increase/decrease only angular speed by 10%

CTRL-C to quit

By pressing the keys “i“, “o“, “k“, you can see in Gazebo that a specific robot will be moving around.
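If you prefer to drive one of the namespaced robots from a script rather than the keyboard teleop, the hedged sketch below does the same thing programmatically: it publishes Twist messages straight to /box_bot9/cmd_vel. The node name and velocity values are just examples.

#!/usr/bin/env python3
# Minimal sketch: publish a constant forward velocity to one namespaced robot.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class BoxBotDriver(Node):
    def __init__(self):
        super().__init__('box_bot_driver')  # hypothetical node name
        # Publish directly to the namespaced topic instead of remapping cmd_vel
        self.pub = self.create_publisher(Twist, '/box_bot9/cmd_vel', 10)
        self.timer = self.create_timer(0.1, self.send_cmd)

    def send_cmd(self):
        msg = Twist()
        msg.linear.x = 0.2   # m/s forward, example value
        msg.angular.z = 0.0
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = BoxBotDriver()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()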

Understanding how the multiple robots are spawned

You may be wondering how the spawn configuration actually worked.

For that, you can have a look at the following file:

ls ~/ros2_ws/src/box_bot/box_bot_description/launch/multi_spawn_robot_launch.py

You can also view the file in the Code Editor. For that, just open the IDE and open that multi_spawn_robot_launch.py file.

Open the IDE – Code Editor

The many robots are basically created in the gen_robots_list function, as sketched below.
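We are not reproducing the real launch file here, but the idea behind gen_robots_list can be sketched roughly as follows: build a list of N robot names and poses, then create one gazebo_ros spawn_entity.py node per robot, each with its own namespace. The description topic and poses below are simplified assumptions; check the actual multi_spawn_robot_launch.py in the rosject for the exact implementation.

# Simplified sketch of spawning N namespaced robots from a ROS2 launch file.
# The real multi_spawn_robot_launch.py in the rosject differs in the details.
from launch import LaunchDescription
from launch_ros.actions import Node

NUMBER_OF_ROBOTS = 10  # example value

def gen_robots_list(n):
    robots = []
    for i in range(n):
        robots.append({'name': 'box_bot' + str(i), 'x': float(i), 'y': 0.0})
    return robots

def generate_launch_description():
    spawn_actions = []
    for robot in gen_robots_list(NUMBER_OF_ROBOTS):
        spawn_actions.append(
            Node(
                package='gazebo_ros',
                executable='spawn_entity.py',
                arguments=[
                    '-entity', robot['name'],
                    '-robot_namespace', robot['name'],  # one namespace per robot
                    # assumed: a robot_state_publisher publishes the description here
                    '-topic', '/' + robot['name'] + '/robot_description',
                    '-x', str(robot['x']), '-y', str(robot['y']),
                ],
                output='screen',
            )
        )
    return LaunchDescription(spawn_actions)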

It is worth mentioning that in order to make this work, we also had to remap the /tf topic in the URDF file.

Congratulations. You now know how to spawn many robots under different namespaces using ROS2.

Youtube video

So this is the post for today. Remember that we have the live version of this post on YouTube. If you liked the content, please consider subscribing to our YouTube channel. We are publishing new content ~every day.

Keep pushing your ROS Learning.


RVIZ2 Tutorials Episode1: Learn TF

What we are going to learn

  1. How to use RVIZ2 step by step
  2. How to add the TFs of a simulation
  3. How to add the TFs of a real robot
  4. How to connect to a real robot through our robot lab
  5. How to start the ros1_bridge to be able to see TF data from ROS1 in ROS2

List of resources used in this post

  1. Use the project: https://app.theconstructsim.com/#/l/47f74f81/
  2. ROS Development Studio (ROSDS) —▸ http://rosds.online
  3. ROS2 Tutorials –▸
    1. ROS2 Basics in 5 Days (Python): https://app.theconstructsim.com/#/Course/73
    2. ROS2 Basics in 5 Days (C++): https://app.theconstructsim.com/#/Course/61

Welcome

Hi, and Welcome to this new series of posts.

In this series, we are going to explain how to add elements to RViz2. We will start with TF, then we will continue with Robots Description, then Cameras, PointClouds, Lasers, and so on.

In this first post, we are going to see how to add TFs in RViz2.

Opening the rosject

In order to learn how to add TFs in RViz2, we need to have ROS installed. We already prepared a rosject ready for that: https://app.theconstructsim.com/#/l/47f74f81/. You can download the rosject on your own computer if you want to work locally, but just by copying the rosject (clicking the link), you will have a setup already prepared for you.

After the rosject has been successfully copied to your own area, you should see a Run button. Just click that button to launch the rosject.

Run rosject: RVIZ2 Tutorials Episode1: TF

Running the rosject

After you have clicked the Run button, the rosject should load and you should see a notebook that is open automatically:

Notebook – Run rosject: RVIZ2 Tutorials Episode1: TF

The notebook contains the instructions to launch a simulation.

Launching the BOX_BOT simulation

In order to launch the simulation, let’s start by opening a new terminal:

Open a new Terminal

After having the first terminal open, let’s run the following commands to launch a simulation:

cd ~/ros2_ws/
colcon build --packages-select box_bot_description box_bot_gazebo ;

source install/setup.bash ;

ros2 launch box_bot_gazebo start_world_launch.py

 

The commands we just executed only launch the world; there is no robot in it yet.

Now, in a second terminal, let’s spawn the robot with the commands below:

cd ~/ros2_ws/

source install/setup.bash

ros2 launch box_bot_gazebo spawn_robot_ros2.launch.py

 

Now that the robot has been successfully spawned, let's open RViz2. For that, let's run the following commands in a third terminal.

cd ~/ros2_ws/

source install/setup.bash

rviz2

 

See the simulation

In order to see the simulation, you can click on the Open Gazebo button:

Click Open Gazebo to view the Gazebo simulation

The final image you get should be similar to the following:

Open Gazebo – RVIZ2 Tutorials Episode1 TF

See rviz2 in the Graphical Tools

In order to see RViz2, you have to click the Open Graphical Tools button, in case the graphical window did not open automatically:

Open Graphical Tools / rviz

 

Now that you can see RViz2, let's set the Fixed Frame on the top left to /odom (odometry), then let's click the Add button on the bottom left and add a TF display.

Once TF is added, let's disable the “All Enabled” option to avoid getting confused by all the TFs that are shown. Let's show only specific frames: chassis, left_wheel, odom, right_wheel. In the end, our RViz2 should look similar to the following:

Specific Frames – RVIZ2 Tutorials Episode1 TF

 

We must also click Show Names to show the frame names in RViz2.

Moving the robot with teleop

In order to see the frames of the robot moving in rviz2, let’s run the Keyboard Teleop in a fourth terminal using the following commands:

cd ~/ros2_ws/

source install/setup.bash

ros2 run teleop_twist_keyboard teleop_twist_keyboard

 

If you now press the keys to move the robot around, you should be able to see the robot moving in the simulation, and you should also see the TFs moving in RViz2.

Moving the robot with teleop – Specific Frames – RVIZ2 Tutorials Episode1 TF
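If you also want to inspect those transforms from code rather than only in RViz2, a small tf2 listener like the hedged sketch below prints the odom → chassis transform while the robot moves. The frame names come from the list above; the node name and polling period are just examples.

#!/usr/bin/env python3
# Minimal sketch: periodically look up and print the odom -> chassis transform.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener

class TFEcho(Node):
    def __init__(self):
        super().__init__('tf_echo_odom_chassis')  # hypothetical node name
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.timer = self.create_timer(1.0, self.lookup)

    def lookup(self):
        try:
            # Latest available transform between the two frames
            t = self.buffer.lookup_transform('odom', 'chassis', Time())
            p = t.transform.translation
            self.get_logger().info('chassis in odom: x=%.2f y=%.2f z=%.2f' % (p.x, p.y, p.z))
        except Exception as error:
            self.get_logger().warn('transform not available yet: %s' % str(error))

def main():
    rclpy.init()
    rclpy.spin(TFEcho())
    rclpy.shutdown()

if __name__ == '__main__':
    main()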

Connecting to a real robot

In order to see the TFs of a real robot, let’s click on the Real Robot Lab button that is available when you are not on the “Desktop” page of The Construct. After clicking Real Robot Lab, please select a robot and click Book Now. It’s free.

Real Robot Lab – Make a reservation

 

When you have the robot available for you, open the rosject and click on the button to connect to the real robot.

Connecting to the Real Robot Lab

 

After the connection is successfully established, you should have something similar to the following:

Real Robot Lab connected

Launch ros1_bridge

Before launching ros1_bridge, let's load the parameters that tell the bridge which topics we want bridged.

Open a terminal and type the following:

cd ~/catkin_ws

roslaunch load_params load_params.launch

 

The command above basically loads instructions that say: hey, ROS1, publish these specific topics because I want them to be available in ROS2.
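We are not showing the contents of load_params.launch here, but conceptually it loads a ROS1 parameter (commonly called topics) that parameter_bridge reads to decide what to bridge. The hedged sketch below shows the same idea from a small rospy script; the exact topic list and parameter layout used by the rosject's launch file may differ.

#!/usr/bin/env python
# Hedged sketch: set the 'topics' parameter that ros1_bridge's parameter_bridge reads.
# The rosject's load_params.launch may load a different list; /tf is the topic we care about here.
import rospy

if __name__ == '__main__':
    rospy.init_node('load_bridge_params')  # hypothetical node name
    rospy.set_param('topics', [
        {'topic': '/tf', 'type': 'tf2_msgs/msg/TFMessage', 'queue_size': 100},
        {'topic': '/odom', 'type': 'nav_msgs/msg/Odometry', 'queue_size': 10},
    ])
    rospy.loginfo('Bridge topic list loaded onto the ROS1 parameter server')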

Now, let’s run the parameter_bridge in a second terminal:

cd ~/ros2_ws/

source /opt/ros/foxy/setup.bash

source install/setup.bash

ros2 run ros1_bridge parameter_bridge

Now that ros1_bridge is running, we should be able to easily see ros2 topics in a third terminal:

cd ~/ros2_ws/

source install/setup.bash

ros2 topic list

You should see some topics listed, and the one we are most interested in today is the /tf topic.

If we now just open rviz2:

rviz2

we should now follow the same procedure we did previously in order to see the TF data in RViz2. Please refer to the section See rviz2 in the Graphical Tools explained above (set /odom as the Fixed Frame and add the TF display).

Moving the robot with ROS1

Now that we are connected to our robot and have our RViz2 setup ready, we can move the robot either with ROS1 or ROS2.

Let’s move it with ROS1, by running the following commands in a fourth terminal:

source ~/catkin_ws/devel/setup.bash

rosrun teleop_twist_keyboard teleop_twist_keyboard.py

 

If you now move the robot by following the instructions, you should see the movement reflected in the camera of the real robot, and also in RViz2:

Seeing real robot TFs in RVIZ2

Congratulations. You now know how to see TF data in RViz2.

Youtube video

So this is the post for today. Remember that we have the live version of this post on YouTube. If you liked the content, please consider subscribing to our YouTube channel. We are publishing new content ~every day.

Keep pushing your ROS Learning.


ROS1 or ROS2 for your Robotics Product?

Overview

This post tries to answer the following question: ROS1 vs ROS2, which one is best for launching a better product faster?

The post is an intro to the webinar where we discussed which version of ROS to use (ROS1 or ROS2) for your next robotics product.

We analyzed the question from the point of view of a company that wants to deliver a professional product, not from the point of view of a ROS Developer.

Would you like to make your team ROS-ready? Quickly and effectively transform your IT developers into robot app developers online. Please have a look at the ROS & Robotics Online Training Solutions For Enterprise.

Quick and Dirty answer

  • Use ROS1: if your product is going to be released before the end of 2021
  • Use ROS2: if you plan to release your product after 2021

Longer answer (Using ROS1)

The reason why ROS1 may be a good choice for you is that ROS1 is already really mature, which allows you to move really fast since it has a ton of mature features ready to use.

The ideal approach would be: you release the robot with ROS1 and check whether it is successful. If that is the case, you can always transition from ROS1 to ROS2 in the next years, once you know from the market that your product is valid.

Longer answer (Using ROS2)

If you are planning to release your product after 2021, go with ROS2.

The reason why you should ideally start with ROS2 right away is that ROS2 is the future of ROS and, therefore, the future of robotics. Your robot will already be ready for what is coming in the next years.

ROS1 vs ROS2 Evaluation Questions for Companies

If you want the answers to deeper ROS-related questions that matter to companies, like the ones below:

  1. Which ROS version has all the technical requirements that my product needs?
  2. Which ROS version for product development?
  3. Which ROS version for product release?

Please, take our Full Webinar available at The Construct for enterprise: https://www.theconstruct.ai/ros-team-training-3/

Important things you have to know before starting with ROS2:

  • Doing everything in ROS2 is going to be a little more difficult than doing it with ROS1. You will need more time to do anything in ROS2.
  • The reason is the one already explained: ROS1 is more mature, so it is very likely that whatever you want to do in ROS1, someone has already done it, and you don't have to reinvent the wheel, whereas in ROS2 it is less likely that everything you need is already implemented.

Youtube video

So this is the post for today. Remember that we have the live version of this post on YouTube. If you liked the content, please consider subscribing to our YouTube channel. We are publishing new content ~every day.

Keep pushing your ROS Learning.


[ROS2 Q&A] 228 – How to work with rosbags and laser_assembler in ROS2

What we are going to learn

  1. How to record your own rosbag
  2. How to use laser_assembler package
  3. How to visualize results from ROS2

List of resources used in this post

  1. Use the rosject: https://app.theconstructsim.com/#/l/4733ef97/
  2. ROS Development Studio (ROSDS) —▸ http://rosds.online
  3. Robot Ignite Academy –▸ https://www.robotigniteacademy.com
  4. [ROS Q&A] 047 – How to use the laser_assembler package: https://www.youtube.com/watch?v=MyA0as18Wkk
  5. ROS2 Tutorials –▸
    1. ROS2 Basics in 5 Days (C++): https://app.theconstructsim.com/#/Course/61
    2. ROS2 Navigation: https://app.theconstructsim.com/#/Course/50
    3. Get ROS2 Industrial Ready 3-Day Online Workshop
  6. The question that this post answers: https://answers.ros.org/question/393958/unable-to-visualize-point-cloud-in-rviz-using-rosbag-in-ros2/

Creating a rosject

In order to learn how to work with rosbags and the laser_assembler in ROS2, we need to have ROS installed. We already prepared a rosject for that: https://app.theconstructsim.com/#/l/4733ef97/. You can download the rosject to your own computer if you prefer to work locally.

If you want to create your own rosject instead of using the link we just provided, you can do it also. We are going to use The Construct (https://www.theconstruct.ai/) for this tutorial, but if you have ROS2 installed on your own computer, you should be able to do ~everything on your own computer, except this creating a rosject part.

Let’s start by opening The Construct (https://www.theconstruct.ai/) and logging in. You can easily create a free account if you still don’t have one.

Once inside, let's click My Rosjects and then Create a new rosject, if you decided to take this route:

My Rosjects

 

Create a new rosject

For the rosject, let's select ROS2 Foxy as the ROS Distro and name the rosject as you like. You can leave the rosject public. You should then see the rosject you just created in your rosjects list (your name will certainly differ from the example below, which was added just for learning purposes).

List of rosjects – Using Depth camera in ROS2 to determine object distance

If you mouse over the recently created rosject, you should see a Run button. Just click that button to launch the rosject.

Launching the laser_scan_assembler server to convert from laser scan to point cloud

Assuming you have a copy of the rosject, after opening it by clicking the Run button mentioned earlier, let’s open the IDE (Code Editor) to see its content.

Open the IDE – Code Editor

You should see a rosbag file named 2022-01-17-11-40-32.bag, which is a rosbag for the turtlebot simulation.

In order to launch the laser_scan_assembler, we need a launch file. If you check carefully, you may notice that we already have a launch file under the path ~/catkin_ws/src/laser_assembler_demo/launch/assembler.launch.

Its content is as follows:

<launch>
    <node type="laser_scan_assembler" pkg="laser_assembler" name="my_assembler">
      <!-- <remap from="scan" to="tilt_scan"/> -->
      <param name="max_scans" type="int" value="400" />
      <param name="fixed_frame" type="string" value="base_footprint" />
    </node>
</launch>

Well, before launching that file, let's learn how to record a rosbag file, in case you want to create your own at some point:

Creating a rosbag file

In order to create a rosbag file, let’s start by opening a new terminal:

Open a new Terminal

After opening the shell, let’s source ROS Noetic:

source /opt/ros/noetic/setup.bash

Let's also source the simulation_ws (simulation workspace), because that is where we have the TurtleBot simulation:

source simulation_ws/devel/setup.bash

rospack profile

Now, let’s launch a turtlebot simulation with the following commands:

export TURTLEBOT3_MODEL=burger

roslaunch turtlebot3_gazebo turtlebot3_world.launch

You can click Open Gazebo to make sure the simulation is running (although for you it may not show the blue laser in Gazebo):

Click Open Gazebo to view the Gazebo simulation

You can now open a new web shell and list the topics with the following command:

source /opt/ros/noetic/setup.bash

rostopic list

The list of topics should be something like the following:

/clock
/cmd_vel
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/performance_metrics
/gazebo/set_link_state
/gazebo/set_model_state
/imu
/joint_states
/odom
/rosout
/rosout_agg
/scan
/tf

So, in order to record all topics in a rosbag file, you can just run the rosbag record -a command. You can press CTRL+C to stop recording, or you can also specify the --duration param if you want to record only for a few seconds or minutes (because rosbag files grow really fast):

rosbag record -a --duration=1m

The logs should be similar to the following:

[ INFO] [1643640738.398965449, 298.390000000]: Recording to '2022-01-31-14-52-18.bag'.
[ INFO] [1643640738.402667915, 298.395000000]: Subscribing to /rosout_agg
[ INFO] [1643640738.408196688, 298.402000000]: Subscribing to /rosout
[ INFO] [1643640738.417414694, 298.406000000]: Subscribing to /clock
[ INFO] [1643640738.492442317, 298.422000000]: Subscribing to /gazebo/link_states
[ INFO] [1643640738.511223062, 298.468000000]: Subscribing to /gazebo/model_states
[ INFO] [1643640738.518154757, 298.481000000]: Subscribing to /gazebo/performance_metrics
[ INFO] [1643640738.526057506, 298.488000000]: Subscribing to /gazebo/parameter_descriptions
[ INFO] [1643640738.598167676, 298.535000000]: Subscribing to /gazebo/parameter_updates
[ INFO] [1643640738.784914410, 298.633000000]: Subscribing to /scan
[ INFO] [1643640738.795856579, 298.650000000]: Subscribing to /joint_states
[ INFO] [1643640738.806535006, 298.665000000]: Subscribing to /tf
[ INFO] [1643640738.817849095, 298.673000000]: Subscribing to /odom
[ INFO] [1643640738.901146360, 298.732000000]: Subscribing to /imu

Let’s use the rosbag file that already exists, called 2022-01-17-11-40-32.bag.

Playing a rosbag file

Now that we have learned how to record a rosbag file, let’s play the one that we already had, called 2022-01-17-11-40-32.bag.

Before that, let’s kill the simulation we launched previously (pressing CTRL+C in the terminal where we launched the simulation).

If we now check the topics again, we should have only the following 3 topics:

rostopic list

  /clock
  /rosout
  /rosout_agg

Let’s first start roscore:

roscore &

If you wait about 5 seconds, roscore will be running. After that, we can play the rosbag:

rosbag play 2022-01-17-11-40-32.bag

The output would be similar to the following:

setting /run_id to 5cbe4c68-82c9-11ec-bbc7-0242c0a86007
[ INFO] [1643656195.261386757]: Connected to master at [1_xterm:11311]
[ INFO] [1643656195.265748530]: Opening 2022-01-17-11-40-32.bag
process[rosout-1]: started with pid [1430]
started core service [/rosout]

Waiting 0.2 seconds after advertising topics... done.

Hit space to toggle paused, or 's' to step.
[RUNNING]  Bag Time:    345.418104   Duration: 13.739104 / 70.197000

If we now check the topics again in another terminal, we will have the topics listed again:

source /opt/ros/noetic/setup.bash

rostopic list

which would give us:

/clock
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/performance_metrics
/imu
/joint_states
/odom
/rosout
/rosout_agg
/scan
/tf
/tf_static

You can now kill the rosbag play process, now that you have learned how to play it and how it works.

Launching the laser_scan_assembler server to convert from laser scan to point cloud

Let’s start by opening 6 terminals. If you want, you can name them like the names below to easily know what each terminal is doing:

  • 1 – roscore
  • 2 – rosbag
  • 3 – assembler
  • 4 – call_assembler
  • 5 – bridge
  • 6 – rviz2

In the end, you would have something like the following image:

How to work with rosbags and laser_assembler in ROS2

Let’s launch roscore in the first terminal (named roscore) if you have killed it:

source /opt/ros/noetic/setup.bash

roscore

Now let’s run rosbag play in the second terminal:

source /opt/ros/noetic/setup.bash

while true; do rosbag play 2022-01-17-11-40-32.bag; sleep 1; done

The reason why we run it inside a while loop is that the playback finishes after ~70 seconds, which is the duration of the rosbag file. Inside the while loop, it keeps replaying “forever” until we stop it with CTRL+C.

Now in the third terminal, let’s run the laser assembler:

source ~/catkin_ws/devel/setup.bash

roslaunch laser_assembler_demo assembler.launch

Let's now go to the 4th terminal and list the services:

source ~/catkin_ws/devel/setup.bash

rosservice list

You should see the services running:

/assemble_scans
/assemble_scans2
/build_cloud
/build_cloud2
/my_assembler/get_loggers
/my_assembler/set_logger_level
/play_1643657183488619006/get_loggers
/play_1643657183488619006/pause_playback
/play_1643657183488619006/set_logger_level
/rosout/get_loggers
/rosout/set_logger_level

Still in the 4th terminal, we can now run the node used to call the assembler:

 rosrun laser_assembler_demo call_assembler.py

The output you get should be similar to the following:

Got cloud with 2592000 points
Got cloud with 2592000 points
Got cloud with 2592000 points
Got cloud with 2592000 points
...
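That output comes from the call_assembler.py node in the rosject. The real script may differ in its details, but the hedged sketch below shows the typical pattern: periodically call the assemble_scans2 service provided by laser_assembler and republish the returned cloud on /laser_pointcloud.

#!/usr/bin/env python
# Hedged sketch of a call_assembler-style node: ask laser_assembler for the cloud
# assembled so far and republish it on /laser_pointcloud. The real script may differ.
import rospy
from sensor_msgs.msg import PointCloud2
from laser_assembler.srv import AssembleScans2

if __name__ == '__main__':
    rospy.init_node('call_assembler_sketch')  # hypothetical node name
    rospy.wait_for_service('assemble_scans2')
    assemble = rospy.ServiceProxy('assemble_scans2', AssembleScans2)
    pub = rospy.Publisher('/laser_pointcloud', PointCloud2, queue_size=1)
    rate = rospy.Rate(1)  # once per second, example value
    while not rospy.is_shutdown():
        # Request everything assembled from time 0 until now
        response = assemble(rospy.Time(0), rospy.get_rostime())
        rospy.loginfo('Got cloud with %d points',
                      response.cloud.width * response.cloud.height)
        pub.publish(response.cloud)
        rate.sleep()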

Everything so far was running in ROS1. Let’s now run ros1_bridge in the 5th terminal to be able to see the topics in ROS2.

source /opt/ros/noetic/setup.bash

source /opt/ros/foxy/setup.bash

ros2 run ros1_bridge dynamic_bridge --bridge-all-topics

Among other text output, you should see the following:

created 1to2 bridge for topic '/clock' with ROS 1 type 'rosgraph_msgs/Clock' and ROS 2 type 'rosgraph_msgs/msg/Clock'
created 1to2 bridge for topic '/gazebo/link_states' with ROS 1 type 'gazebo_msgs/LinkStates' and ROS 2 type 'gazebo_msgs/msg/LinkStates'
created 1to2 bridge for topic '/gazebo/model_states' with ROS 1 type 'gazebo_msgs/ModelStates' and ROS 2 type 'gazebo_msgs/msg/ModelStates'
created 1to2 bridge for topic '/imu' with ROS 1 type 'sensor_msgs/Imu' and ROS 2 type 'sensor_msgs/msg/Imu'
created 1to2 bridge for topic '/joint_states' with ROS 1 type 'sensor_msgs/JointState' and ROS 2 type 'sensor_msgs/msg/JointState'
created 1to2 bridge for topic '/laser_pointcloud' with ROS 1 type 'sensor_msgs/PointCloud2' and ROS 2 type 'sensor_msgs/msg/PointCloud2'
created 1to2 bridge for topic '/odom' with ROS 1 type 'nav_msgs/Odometry' and ROS 2 type 'nav_msgs/msg/Odometry'
created 1to2 bridge for topic '/rosout' with ROS 1 type 'rosgraph_msgs/Log' and ROS 2 type 'rcl_interfaces/msg/Log'
created 1to2 bridge for topic '/rosout_agg' with ROS 1 type 'rosgraph_msgs/Log' and ROS 2 type 'rcl_interfaces/msg/Log'
created 1to2 bridge for topic '/scan' with ROS 1 type 'sensor_msgs/LaserScan' and ROS 2 type 'sensor_msgs/msg/LaserScan'
created 1to2 bridge for topic '/tf' with ROS 1 type 'tf2_msgs/TFMessage' and ROS 2 type 'tf2_msgs/msg/TFMessage'
created 1to2 bridge for topic '/tf_static' with ROS 1 type 'tf2_msgs/TFMessage' and ROS 2 type 'tf2_msgs/msg/TFMessage'

Now, let’s launch rviz2 in the 6th terminal:

rviz2

If you now open the Graphical Tools, you should be able to see rviz:

Open Graphical Tools / rviz

Once you have opened the Graphical Tools, let’s change the Fixed Frame in rviz2 to /base_footprint  instead of /map.

Let's also click Add (in the bottom left of rviz2) and select PointCloud2. After the PointCloud2 display has been added, let's select the topic /laser_pointcloud (as defined in ~/catkin_ws/src/laser_assembler_demo/src/call_assembler.py):

Now, you should be able to see the assembled laser point cloud shown in rviz2 (in red).

Laser Pointcloud being output in rviz2 with ros1_bridge

Congratulations. You now know how to work with rosbags and laser_assembler in ROS2.

Youtube video

So this is the post for today. Remember that we have the live version of this post on YouTube. If you liked the content, please consider subscribing to our YouTube channel. We are publishing new content ~every day.

Keep pushing your ROS Learning.


[ROS2 Q&A] 227 – Work with ROS2 depth camera data

What we are going to learn

  1. How to obtain depth camera data through a ROS1 simulation
  2. How to use the topics offered by the camera to determine object distance

List of resources used in this post

  1. Use the rosject: https://app.theconstructsim.com/#/l/467026ab/
  2. ROS Development Studio (ROSDS) —▸ http://rosds.online
  3. Robot Ignite Academy –▸ https://www.robotigniteacademy.com
  4. ROS2 Tutorials –▸
    1. ROS2 Basics in 5 Days (C++): https://app.theconstructsim.com/#/Course/61
    2. ROS2 Navigation: https://app.theconstructsim.com/#/Course/50
    3. Get ROS2 Industrial Ready 3-Day Online Workshop

Creating a rosject

In order to learn how to work with ROS2 depth camera data, we need to have ROS installed. We already prepared a rosject for that: https://app.theconstructsim.com/#/l/467026ab/

If you want to create your own rosject instead of using the link we just provided, you can do it also. We are going to use The Construct (https://www.theconstruct.ai/) for this tutorial, but if you have ROS2 installed on your own computer, you should be able to do ~everything on your own computer, except this creating a rosject part.

Let’s start by opening The Construct (https://www.theconstruct.ai/) and logging in. You can easily create a free account if you still don’t have one.

Once inside, let's click My Rosjects and then Create a new rosject, if you decided to take this route:

My Rosjects

 

Create a new rosject

For the rosject, let's select ROS2 Foxy as the ROS Distro and name the rosject as you like. You can leave the rosject public. You should then see the rosject you just created in your rosjects list (your name will certainly differ from the example below, which was added just for learning purposes).

List of rosjects – Using Depth camera in ROS2 to determine object distance

If you mouse over the recently created rosject, you should see a Run button. Just click that button to launch the rosject.

Starting a simulation

In order to launch a simulation, let’s start by opening a new terminal:

Open a new Terminal

We can actually open 3 terminals (just by clicking the plus “+” button that appears beside the number #569 in the image above).

At the moment there are not many simulations that include depth cameras for ROS2; therefore, we are going to use ROS1 to launch a simulation and then use ros1_bridge to make the topics available in ROS2.

After having the terminals open, let's source ROS1 Noetic in the first terminal, and also source the workspace that contains the simulation:

source /opt/ros/noetic/setup.bash

source simulation_ws/devel/setup.bash

We can now launch the simulation with:

 roslaunch ur_e_gazebo ur3e.launch

The simulation should have been launched. You can ignore some error messages that appear for the moment. The logs should be something similar to the following:

Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
xacro: in-order processing became default in ROS Melodic. You can drop the option.
started roslaunch server http://4_xterm:32787/
SUMMARY
========
PARAMETERS
 * /gazebo/enable_ros_network: True
 * /gripper_controller/joint: egh_gripper_finge...
 * /gripper_controller/type: position_controll...
 * /joint_group_position_controller/joints: ['shoulder_pan_jo...
 * /joint_group_position_controller/type: position_controll...
 * /joint_state_controller/publish_rate: 50
 * /joint_state_controller/type: joint_state_contr...
 * /robot_description: <?xml version="1....
 * /robot_state_publisher/publish_frequency: 50.0
 * /robot_state_publisher/tf_prefix:
 * /rosdistro: noetic
 * /rosversion: 1.15.11
 * /scaled_pos_joint_traj_controller/action_monitor_rate: 10
 * /scaled_pos_joint_traj_controller/constraints/elbow_joint/goal: 0.1
 * /scaled_pos_joint_traj_controller/constraints/elbow_joint/trajectory: 0.1
 * /scaled_pos_joint_traj_controller/constraints/goal_time: 0.6
 * /scaled_pos_joint_traj_controller/constraints/shoulder_lift_joint/goal: 0.1
 * /scaled_pos_joint_traj_controller/constraints/shoulder_lift_joint/trajectory: 0.1
 * /scaled_pos_joint_traj_controller/constraints/shoulder_pan_joint/goal: 0.1
 * /scaled_pos_joint_traj_controller/constraints/shoulder_pan_joint/trajectory: 0.1
 * /scaled_pos_joint_traj_controller/constraints/stopped_velocity_tolerance: 0.05
 * /scaled_pos_joint_traj_controller/constraints/wrist_1_joint/goal: 0.1
 * /scaled_pos_joint_traj_controller/constraints/wrist_1_joint/trajectory: 0.1
 * /scaled_pos_joint_traj_controller/constraints/wrist_2_joint/goal: 0.1
 * /scaled_pos_joint_traj_controller/constraints/wrist_2_joint/trajectory: 0.1
 * /scaled_pos_joint_traj_controller/constraints/wrist_3_joint/goal: 0.1
 * /scaled_pos_joint_traj_controller/constraints/wrist_3_joint/trajectory: 0.1
 * /scaled_pos_joint_traj_controller/joints: ['shoulder_pan_jo...
 * /scaled_pos_joint_traj_controller/state_publish_rate: 25
 * /scaled_pos_joint_traj_controller/stop_trajectory_duration: 0.5
 * /scaled_pos_joint_traj_controller/type: position_controll...
 * /use_sim_time: True
NODES
  /
    arm_controller_spawner (controller_manager/controller_manager)
    controller_spawner_gripper (controller_manager/spawner)
    fake_joint_calibration (rostopic/rostopic)
    gazebo (gazebo_ros/gzserver)
    gazebo_gui (gazebo_ros/gzclient)
    joint_state_controller_spawner (controller_manager/controller_manager)
    robot_state_publisher (robot_state_publisher/robot_state_publisher)
    ros_control_controller_manager (controller_manager/controller_manager)
    spawn_gazebo_model (gazebo_ros/spawn_model)
auto-starting new master
process[master]: started with pid [3700]
ROS_MASTER_URI=http://4_xterm:11311
setting /run_id to 0e52b568-6cd6-11ec-9dc4-0242ac1b0007
process[rosout-1]: started with pid [3719]
started core service [/rosout]
process[gazebo-2]: started with pid [3722]
process[gazebo_gui-3]: started with pid [3724]
process[spawn_gazebo_model-4]: started with pid [3729]
process[robot_state_publisher-5]: started with pid [3733]
process[fake_joint_calibration-6]: started with pid [3741]
process[joint_state_controller_spawner-7]: started with pid [3742]
process[arm_controller_spawner-8]: started with pid [3747]
process[controller_spawner_gripper-9]: started with pid [3748]
process[ros_control_controller_manager-10]: started with pid [3749]
++ ls /usr/bin/gzclient-11.5.1
+ gzclient_path=/usr/bin/gzclient-11.5.1
+ DISPLAY=:2
+ /usr/bin/gzclient-11.5.1 -g /opt/ros/noetic/lib/libgazebo_ros_paths_plugin.so -g /opt/ros/noetic/lib/libgazebo_ros_api_plugin.so __name:=gazebo_gui __log:=/home/user/.ros/log/0e52b568-6cd6-11ec-9dc4-0242ac1b0007/gazebo_gui-3.log
INFO: cannot create a symlink to latest log directory: [Errno 2] No such file or directory: '/home/user/.ros/log/latest'
[INFO] [1641242725.661746, 0.000000]: Controller Spawner: Waiting for service controller_manager/load_controller
[INFO] [1641242725.761337, 0.000000]: Loading model XML from ros parameter robot_description
[INFO] [1641242725.768781, 0.000000]: Waiting for service /gazebo/spawn_urdf_model
Gazebo multi-robot simulator, version 11.5.1
Copyright (C) 2012 Open Source Robotics Foundation.
Released under the Apache 2 License.
http://gazebosim.org
[ INFO] [1641242726.627685197]: Finished loading Gazebo ROS API Plugin.
[ INFO] [1641242726.629782039]: waitForService: Service [/gazebo/set_physics_properties] has not been advertised, waiting...
[Msg] Waiting for master.
[Msg] Connected to gazebo master @ http://172.27.0.7:11345
[Msg] Publicized address: 172.27.0.7
[ INFO] [1641242726.738771830]: Finished loading Gazebo ROS API Plugin.
[ INFO] [1641242726.742342582]: waitForService: Service [/gazebo_gui/set_physics_properties] has not been advertised, waiting...
[Msg] Loading world file [/usr/share/gazebo-11/worlds/empty.world]
[ INFO] [1641242727.138178149]: waitForService: Service [/gazebo/set_physics_properties] is now available.
[ INFO] [1641242727.244679729, 0.048000000]: Physics dynamic reconfigure ready.
[INFO] [1641242727.280947, 0.088000]: Calling service /gazebo/spawn_urdf_model
[INFO] [1641242727.537885, 0.104000]: Spawn status: SpawnModel: Successfully spawned entity
[spawn_gazebo_model-4] process has finished cleanly
log file: /home/user/.ros/log/0e52b568-6cd6-11ec-9dc4-0242ac1b0007/spawn_gazebo_model-4*.log
[ INFO] [1641242729.504743470, 0.104000000]: Camera Plugin: Using the 'robotNamespace' param: '/'
[ INFO] [1641242729.513059783, 0.104000000]: Camera Plugin (ns = /)  <tf_prefix_>, set to ""
[ INFO] [1641242730.186888511, 0.104000000]: Loading gazebo_ros_control plugin
[ INFO] [1641242730.187188566, 0.104000000]: Starting gazebo_ros_control plugin in namespace: /
[ INFO] [1641242730.188738334, 0.104000000]: gazebo_ros_control plugin is waiting for model URDF in parameter [robot_description] on the ROS param server.
[ERROR] [1641242730.480595227, 0.104000000]: No p gain specified for pid.  Namespace: /gazebo_ros_control/pid_gains/shoulder_pan_joint
[ERROR] [1641242730.482573352, 0.104000000]: No p gain specified for pid.  Namespace: /gazebo_ros_control/pid_gains/shoulder_lift_joint
[ERROR] [1641242730.483960993, 0.104000000]: No p gain specified for pid.  Namespace: /gazebo_ros_control/pid_gains/elbow_joint
[ERROR] [1641242730.485373065, 0.104000000]: No p gain specified for pid.  Namespace: /gazebo_ros_control/pid_gains/wrist_1_joint
[ERROR] [1641242730.486771098, 0.104000000]: No p gain specified for pid.  Namespace: /gazebo_ros_control/pid_gains/wrist_2_joint
[ERROR] [1641242730.488209294, 0.104000000]: No p gain specified for pid.  Namespace: /gazebo_ros_control/pid_gains/wrist_3_joint
[ERROR] [1641242730.489539274, 0.104000000]: No p gain specified for pid.  Namespace: /gazebo_ros_control/pid_gains/egh_gripper_finger_left_joint
[ INFO] [1641242730.503064828, 0.104000000]: Loaded gazebo_ros_control.
[ INFO] [1641242730.514695030, 0.104000000]: MimicJointPlugin loaded! Joint: "egh_gripper_finger_left_joint", Mimic joint: "egh_gripper_finger_right_joint", Multiplier: 1, Offset: 0, MaxEffort: 200, Sensitiveness: 0
[Msg] Loading grasp-fix plugin
[Msg] GazeboGraspFix: Using disable_collisions_on_attach 0
[Msg] GazeboGraspFix: Using update rate 10
[Msg] GazeboGraspFix: Using max_grip_count 10
[Msg] GazeboGraspFix: Using grip_count_threshold 3
[Msg] GazeboGraspFix: Using release_tolerance 0.001
[Msg] GazeboGraspFix: Adding collision scoped name robot::egh_gripper_left_finger::egh_gripper_left_finger_collision
[Msg] GazeboGraspFix: Adding collision scoped name robot::egh_gripper_right_finger::egh_gripper_right_finger_collision
[Msg] Subscribing contact manager to topic ~/robot/contacts
[Msg] Advertising grasping events on topic grasp_events
[Wrn] [Publisher.cc:136] Queue limit reached for topic /gazebo/default/pose/local/info, deleting message. This warning is printed only once.
[INFO] [1641242730.834305, 0.401000]: Controller Spawner: Waiting for service controller_manager/switch_controller
[INFO] [1641242730.842565, 0.415000]: Controller Spawner: Waiting for service controller_manager/unload_controller
[INFO] [1641242730.846773, 0.422000]: Loading controller: gripper_controller
Loaded 'joint_state_controller'
[ERROR] [1641242731.068522252, 0.634000000]: Could not load controller 'arm_controller' because the type was not specified. Did you load the controller configuration on the parameter server (namespace: '/arm_controller')?
Error when loading 'arm_controller'
Loaded 'joint_group_position_controller'
[INFO] [1641242731.563780, 1.066000]: Controller Spawner: Loaded controllers: gripper_controller
[ERROR] [1641242731.564298128, 1.067000000]: Could not start controller with name 'arm_controller' because no controller with this name exists
Started ['joint_state_controller'] successfully
Error when starting ['arm_controller'] and stopping []
[INFO] [1641242731.573472, 1.076000]: Started controllers: gripper_controller

 

You can now click Open Gazebo to see the simulation:

Click Open Gazebo to see the ur3 simulation with depth camera

We are most interested in the depth camera that we can see inside a red box in the image above.

That sensor has a normal RGB camera and a depth camera that can be used to measure distance.

Launching ros1_bridge

Now that we have the simulation running in ros1, let’s start ros1_bridge so that we can interact with the topics using ros2.

For that, let's run the following commands (in this order: first ROS1, then ROS2) in the second terminal to source ROS:

source /opt/ros/noetic/setup.bash

source /opt/ros/foxy/setup.bash

We can now launch ros1_bridge with:

ros2 run ros1_bridge dynamic_bridge --bridge-all-topics

The output should be similar to:

created 1to2 bridge for topic '/calibrated' with ROS 1 type 'std_msgs/Bool' and ROS 2 type 'std_msgs/msg/Bool'
created 1to2 bridge for topic '/clock' with ROS 1 type 'rosgraph_msgs/Clock' and ROS 2 type 'rosgraph_msgs/msg/Clock'
created 1to2 bridge for topic '/gazebo/link_states' with ROS 1 type 'gazebo_msgs/LinkStates' and ROS 2 type 'gazebo_msgs/msg/LinkStates'
created 1to2 bridge for topic '/gazebo/model_states' with ROS 1 type 'gazebo_msgs/ModelStates' and ROS 2 type 'gazebo_msgs/msg/ModelStates'
created 1to2 bridge for topic '/gripper_controller/gripper_cmd/status' with ROS 1 type 'actionlib_msgs/GoalStatusArray' and ROS 2 type 'actionlib_msgs/msg/GoalStatusArray'
created 1to2 bridge for topic '/joint_states' with ROS 1 type 'sensor_msgs/JointState' and ROS 2 type 'sensor_msgs/msg/JointState'
created 1to2 bridge for topic '/rosout' with ROS 1 type 'rosgraph_msgs/Log' and ROS 2 type 'rcl_interfaces/msg/Log'
created 1to2 bridge for topic '/rosout_agg' with ROS 1 type 'rosgraph_msgs/Log' and ROS 2 type 'rcl_interfaces/msg/Log'
created 1to2 bridge for topic '/tf' with ROS 1 type 'tf2_msgs/TFMessage' and ROS 2 type 'tf2_msgs/msg/TFMessage'
created 1to2 bridge for topic '/tf_static' with ROS 1 type 'tf2_msgs/TFMessage' and ROS 2 type 'tf2_msgs/msg/TFMessage'
created 1to2 bridge for topic '/wrist_rgbd/depth/camera_info' with ROS 1 type 'sensor_msgs/CameraInfo' and ROS 2 type 'sensor_msgs/msg/CameraInfo'
created 1to2 bridge for topic '/wrist_rgbd/depth/image_raw' with ROS 1 type 'sensor_msgs/Image' and ROS 2 type 'sensor_msgs/msg/Image'
created 1to2 bridge for topic '/wrist_rgbd/depth/points' with ROS 1 type 'sensor_msgs/PointCloud2' and ROS 2 type 'sensor_msgs/msg/PointCloud2'
created 1to2 bridge for topic '/wrist_rgbd/rgb/camera_info' with ROS 1 type 'sensor_msgs/CameraInfo' and ROS 2 type 'sensor_msgs/msg/CameraInfo'
created 1to2 bridge for topic '/wrist_rgbd/rgb/image_raw' with ROS 1 type 'sensor_msgs/Image' and ROS 2 type 'sensor_msgs/msg/Image'
created 1to2 bridge for topic '/wrist_rgbd/rgb/image_raw/compressed' with ROS 1 type 'sensor_msgs/CompressedImage' and ROS 2 type 'sensor_msgs/msg/CompressedImage'
created 1to2 bridge for topic '/wrist_rgbd/rgb/image_raw/compressedDepth' with ROS 1 type 'sensor_msgs/CompressedImage' and ROS 2 type 'sensor_msgs/msg/CompressedImage'
Created 2 to 1 bridge for service /gazebo/clear_body_wrenches
Created 2 to 1 bridge for service /gazebo/clear_joint_forces
Created 2 to 1 bridge for service /gazebo/delete_light
Created 2 to 1 bridge for service /gazebo/delete_model
Created 2 to 1 bridge for service /gazebo/get_joint_properties
Created 2 to 1 bridge for service /gazebo/get_light_properties
Created 2 to 1 bridge for service /gazebo/get_link_properties
Created 2 to 1 bridge for service /gazebo/get_link_state
Created 2 to 1 bridge for service /gazebo/get_model_properties
Created 2 to 1 bridge for service /gazebo/get_model_state
Created 2 to 1 bridge for service /gazebo/get_physics_properties
Created 2 to 1 bridge for service /gazebo/get_world_properties
Created 2 to 1 bridge for service /gazebo/pause_physics
Created 2 to 1 bridge for service /gazebo/reset_simulation
Created 2 to 1 bridge for service /gazebo/reset_world
Created 2 to 1 bridge for service /gazebo/set_joint_properties
Created 2 to 1 bridge for service /gazebo/set_link_properties
Created 2 to 1 bridge for service /gazebo/set_link_state
Created 2 to 1 bridge for service /gazebo/set_model_configuration
Created 2 to 1 bridge for service /gazebo/set_model_state
Created 2 to 1 bridge for service /gazebo/set_physics_properties
Created 2 to 1 bridge for service /gazebo/spawn_sdf_model
Created 2 to 1 bridge for service /gazebo/spawn_urdf_model
Created 2 to 1 bridge for service /gazebo/unpause_physics
Created 2 to 1 bridge for service /wrist_rgbd/set_camera_info
[INFO] [1641243318.565086017] [ros_bridge]: Passing message from ROS 1 std_msgs/Bool to ROS 2 std_msgs/msg/Bool (showing msg only once per type)
[INFO] [1641243318.565407501] [ros_bridge]: Passing message from ROS 1 rosgraph_msgs/Clock to ROS 2 rosgraph_msgs/msg/Clock (showing msg only once per type)
[INFO] [1641243318.634712568] [ros_bridge]: Passing message from ROS 1 gazebo_msgs/LinkStates to ROS 2 gazebo_msgs/msg/LinkStates (showing msg only once per type)
[INFO] [1641243318.636128359] [ros_bridge]: Passing message from ROS 1 gazebo_msgs/ModelStates to ROS 2 gazebo_msgs/msg/ModelStates (showing msg only once per type)
[INFO] [1641243318.636843101] [ros_bridge]: Passing message from ROS 1 actionlib_msgs/GoalStatusArray to ROS 2 actionlib_msgs/msg/GoalStatusArray (showing msg only once per type)
[INFO] [1641243318.637282514] [ros_bridge]: Passing message from ROS 1 sensor_msgs/JointState to ROS 2 sensor_msgs/msg/JointState (showing msg only once per type)
[INFO] [1641243318.637751048] [ros_bridge]: Passing message from ROS 1 rosgraph_msgs/Log to ROS 2 rcl_interfaces/msg/Log (showing msg only once per type)
[INFO] [1641243318.638771467] [ros_bridge]: Passing message from ROS 1 tf2_msgs/TFMessage to ROS 2 tf2_msgs/msg/TFMessage (showing msg only once per type)
created 2to1 bridge for topic '/calibrated' with ROS 2 type 'std_msgs/msg/Bool' and ROS 1 type 'std_msgs/Bool'
created 2to1 bridge for topic '/rosout' with ROS 2 type 'rcl_interfaces/msg/Log' and ROS 1 type 'rosgraph_msgs/Log'
removed 2to1 bridge for topic '/calibrated'
[INFO] [1641243320.970126882] [ros_bridge]: Passing message from ROS 1 sensor_msgs/Image to ROS 2 sensor_msgs/msg/Image (showing msg only once per type)
[INFO] [1641243320.971249839] [ros_bridge]: Passing message from ROS 1 sensor_msgs/CameraInfo to ROS 2 sensor_msgs/msg/CameraInfo (showing msg only once per type)
[INFO] [1641243321.032322768] [ros_bridge]: Passing message from ROS 2 rcl_interfaces/msg/Log to ROS 1 rosgraph_msgs/Log (showing msg only once per type)
[INFO] [1641243321.057171155] [ros_bridge]: Passing message from ROS 1 sensor_msgs/PointCloud2 to ROS 2 sensor_msgs/msg/PointCloud2 (showing msg only once per type)
[INFO] [1641243321.136791384] [ros_bridge]: Passing message from ROS 1 sensor_msgs/CompressedImage to ROS 2 sensor_msgs/msg/CompressedImage (showing msg only once per type)

If we now go to the third terminal, we can list the topics in ros2 with:

ros2 topic list

which would show something like this:

/calibrated
/clock
/gazebo/link_states
/gazebo/model_states
/gripper_controller/gripper_cmd/status
/joint_states
/parameter_events
/rosout
/rosout_agg
/tf
/tf_static
/wrist_rgbd/depth/camera_info
/wrist_rgbd/depth/image_raw
/wrist_rgbd/depth/points
/wrist_rgbd/rgb/camera_info
/wrist_rgbd/rgb/image_raw
/wrist_rgbd/rgb/image_raw/compressed
/wrist_rgbd/rgb/image_raw/compressedDepth

The topics we are most interested in are the ones that start with /wrist_rgbd:

/wrist_rgbd/depth/camera_info
/wrist_rgbd/depth/image_raw
/wrist_rgbd/depth/points
/wrist_rgbd/rgb/camera_info
/wrist_rgbd/rgb/image_raw
/wrist_rgbd/rgb/image_raw/compressed
/wrist_rgbd/rgb/image_raw/compressedDepth

Some important notes about the topics above:

  • The /wrist_rgbd/depth/camera_info topic does not show the distance of any object. It just shows the configuration of the camera.
  • The /wrist_rgbd/rgb/image_raw topic shows an array of pixels representing the colors we can see in an image. It does not contain any distance information either.
  • The topic we are really interested in is this one: /wrist_rgbd/depth/points

Let’s open rviz2 in a fourth terminal so that we can understand things better by visualizing data like the depth information of the camera:

rviz2

RViz is a graphical tool, so in order to see it, you have to click the Graphical Tools button.

Open Graphical Tools / rviz

Now that rviz is open, let's change the Fixed Frame to wrist_rgbd_camera_link.

After that, let's click the Add button that appears on the bottom left, then click the By Topic tab and select /wrist_rgbd/depth/points:

Add – By Topic – wrist_rgbd/depth/points

You should now be able to see the depth info visually:

Add – By Topic – wrist_rgbd depth points visually

The points you see in the last image above actually tell you the distance of the arm.

By checking the values that are sent in the /wrist_rgbd/depth/points topic you can easily determine how far an object is.

That is pretty much how you would use a depth camera to determine the distance of an object.
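If you want to check those values programmatically instead of only visually in rviz2, the hedged sketch below subscribes to the bridged /wrist_rgbd/depth/points topic and prints the distance of the closest point. It assumes the cloud has float32 x, y, z fields in little-endian order, which is the usual layout for Gazebo depth cameras, and it only samples a subset of the points to keep the Python loop light.

#!/usr/bin/env python3
# Hedged sketch: print the distance of the closest point in the depth point cloud.
# Assumes float32 x, y, z fields and little-endian data (the usual Gazebo layout).
import math
import struct
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2

class ClosestPoint(Node):
    def __init__(self):
        super().__init__('closest_point_checker')  # hypothetical node name
        self.create_subscription(PointCloud2, '/wrist_rgbd/depth/points', self.on_cloud, 1)

    def on_cloud(self, msg):
        # Byte offsets of the x, y, z fields inside each point record
        offsets = {f.name: f.offset for f in msg.fields}
        data = bytes(msg.data)
        closest = float('inf')
        # Sample every 50th point so the pure-Python loop stays fast
        for i in range(0, len(data) - msg.point_step + 1, msg.point_step * 50):
            x, = struct.unpack_from('<f', data, i + offsets['x'])
            y, = struct.unpack_from('<f', data, i + offsets['y'])
            z, = struct.unpack_from('<f', data, i + offsets['z'])
            if not (math.isnan(x) or math.isnan(y) or math.isnan(z)):
                closest = min(closest, math.sqrt(x * x + y * y + z * z))
        self.get_logger().info('Closest sampled point: %.3f m away' % closest)

def main():
    rclpy.init()
    rclpy.spin(ClosestPoint())
    rclpy.shutdown()

if __name__ == '__main__':
    main()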

Youtube video

So this is the post for today. Remember that we have the live version of this post on YouTube. If you liked the content, please consider subscribing to our YouTube channel. We are publishing new content ~every day.

Keep pushing your ROS Learning.
