[ROS Q&A] 195 – How to know if robot has moved one meter using Odometry


Learn how to know whether the robot has moved one meter using Odometry. You'll learn how to:

  • create a Python program that calculates the distance moved by the robot using the Odometry topic and publishes it to a topic.
  • create a Python program that moves the robot and stops it based on that published distance.

Let’s go!


Related Resources


Step 1: Get your development environment ready and grab a copy of the code

  1. Click here to grab a copy of the ROSject already containing the project. This requires an account on the ROS Development Studio (ROSDS), an online platform for developing ROS projects from a web browser. If you don’t have an account, you will be asked to create one.
  2. To open a “terminal” on ROSDS, pick the Shell app from the Tools menu.
  3. You can find the IDE app on the Tools menu.

Step 2: Start a Simulation

We are going to use the TurtleBot 2 simulation.

  1. Click on Simulations from the main menu.
  2. Under Launch a provided simulation, leave “Empty world” selected and click on Choose a robot.
  3. Type “turtle” into the search box and select TurtleBot 2.
  4. Click Start Simulation.
  5. A window showing the TurtleBot 2 in an empty world should show up.

Launch a provided sim

Step 3: Examine the structure of Odometry messages

Let’s see the structure of an Odom message. Open a terminal and run:

user:~$ rostopic echo /odom -n1
header:
  seq: 14929
  stamp:
    secs: 748
    nsecs: 215000000
  frame_id: "odom"
child_frame_id: "base_footprint"
pose:
  pose:
    position:
      x: 0.00668370211388
      y: 0.00010960687178
      z: -0.000246865753431
    orientation:
      x: 0.000389068511716
      y: -0.00720626927928
      z: 0.000354481463316
      w: 0.999973895985
  covariance: [1e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001]
twist:
  twist:
    linear:
      x: 2.30945682841e-05
      y: 0.000390977104083
      z: 0.0
    angular:
      x: 0.0
      y: 0.0
      z: -0.000507764995516
  covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---

The part of the message that stores the current position of the robot is the pose.pose.position:

user:~$ rostopic echo /odom -n1
# ...
pose:
  pose:
    position:
      x: 0.00668370211388
      y: 0.00010960687178
      z: -0.000246865753431
# ...

This part is what is used in the Python script that calculates the distance moved.
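
Before looking at the full scripts, here is a minimal sketch (not part of the ROSject; the node name is just an example) showing how that field can be read and turned into a planar distance with math.hypot, the same idea the movement detector uses:

#!/usr/bin/env python
import math

import rospy
from nav_msgs.msg import Odometry

rospy.init_node('odom_distance_sketch')

# Grab two consecutive odometry messages (blocking calls)
first = rospy.wait_for_message('/odom', Odometry)
second = rospy.wait_for_message('/odom', Odometry)

p1 = first.pose.pose.position
p2 = second.pose.pose.position

# Planar (x, y) distance between the two sampled positions
print('Distance between samples: %.6f m' % math.hypot(p2.x - p1.x, p2.y - p1.y))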

Step 4: Understand the Python scripts

1. Open the IDE and browse to the src folder. You will find two packages: odom_movement_detector and test_movement. Each of these packages contains a Python script, explained with comments in the code.

odom_movement_detector/src/odom_movement_detector.py

#!/usr/bin/env python

import rospy
import math
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Point
from std_msgs.msg import Float64

class MovementDetector(object):
    def __init__(self):
        """Initialize an object of the MovementDetector class."""
        # _mved_distance stores the total distance moved.
        # Create and initialize it here with an initial value of 0.0.
        self._mved_distance = Float64()
        self._mved_distance.data = 0.0

        # Get the initial position. This will be the reference point for calculating
        # the distance moved.
        self.get_init_position()

        # Create a publisher for publishing the distance moved into the topic '/moved_distance'
        self.distance_moved_pub = rospy.Publisher('/moved_distance', Float64, queue_size=1)

        # create a subscriber for getting new Odometry messages
        rospy.Subscriber("/odom", Odometry, self.odom_callback)

    def get_init_position(self):
        """Get the initial position of the robot."""
        """
        Structure of the odom position message:
        user:~$ rostopic echo /odom -n1
        header:
        seq: 14929
        stamp:
            secs: 748
            nsecs: 215000000
        frame_id: "odom"
        child_frame_id: "base_footprint"
        pose:
        pose:
            position:
            x: 0.00668370211388
            y: 0.00010960687178
            z: -0.000246865753431
        """
        data_odom = None
        # wait for a message from the odometry topic and store it in data_odom when available
        while data_odom is None:
            try:
                data_odom = rospy.wait_for_message("/odom", Odometry, timeout=1)
            except:
                rospy.loginfo("Current odom not ready yet, retrying for setting up init pose")
        
        # Store the received odometry "position" variable in a Point instance 
        self._current_position = Point()
        self._current_position.x = data_odom.pose.pose.position.x
        self._current_position.y = data_odom.pose.pose.position.y
        self._current_position.z = data_odom.pose.pose.position.z

    def odom_callback(self, msg):
        """Process odometry data sent by the subscriber."""
        # Get the position information from the odom message
        # See the structure of an /odom message in the `get_init_position` function
        NewPosition = msg.pose.pose.position

        # Calculate the distance moved since the last message and add it to _mved_distance
        self._mved_distance.data += self.calculate_distance(NewPosition, self._current_position)
        
        # Update the current position of the robot so we have a new reference point
        # (The robot has moved and so we need a new reference for calculations)
        self.updatecurrent_positin(NewPosition)
        
        # If the accumulated distance is negligible, publish zero;
        # otherwise publish the accumulated distance
        if self._mved_distance.data < 0.000001:
            aux = Float64()
            aux.data = 0.0
            self.distance_moved_pub.publish(aux)
        else:
            self.distance_moved_pub.publish(self._mved_distance)

    def updatecurrent_positin(self, new_position):
        """Update the current position of the robot."""
        self._current_position.x = new_position.x
        self._current_position.y = new_position.y
        self._current_position.z = new_position.z

    def calculate_distance(self, new_position, old_position):
        """Calculate the distance between two Points (positions)."""
        x2 = new_position.x
        x1 = old_position.x
        y2 = new_position.y
        y1 = old_position.y
        dist = math.hypot(x2 - x1, y2 - y1)
        return dist

    def publish_moved_distance(self):
        # spin() simply keeps python from exiting until this node is stopped
        rospy.spin()

if __name__ == '__main__':
    # create a node for running the program
    rospy.init_node('movement_detector_node', anonymous=True)

    # create an instance of the MovementDetector class and set the code
    # in motion
    movement_obj = MovementDetector()
    movement_obj.publish_moved_distance()

test_movement/src/test_movement.py

#!/usr/bin/env python

import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import Float64

def my_callback(msg):
    """Callback function that processes messages from the subscriber."""

    # get the distance moved from the message
    distance_moved = msg.data

    # If the distance moved is less than 2 meters, keep moving the robot
    # Otherwise, stop it (by publishing `0`)
    if msg.data < 2:
        move.linear.x = 0.1

    if msg.data > 2:
        move.linear.x = 0

    pub.publish(move)

# create a node for running the program
rospy.init_node('test_movement')

# create a subscriber that gets the distance moved
sub = rospy.Subscriber('/moved_distance', Float64, my_callback)

# Create a publisher that moves the robot
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

# Create a global variable for publishing a Twist ("cmd_vel") message
move = Twist()

# Keep the program running
rospy.spin()
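
One small remark about the callback logic: as written, nothing changes when the reported distance is exactly 2, so the previously set velocity simply gets re-published, which is harmless here. If you prefer a single threshold check, an equivalent callback could look like this (just a sketch, reusing the same global move and pub objects):

def my_callback(msg):
    """Move at 0.1 m/s until the reported distance reaches 2 m, then stop."""
    move.linear.x = 0.1 if msg.data < 2.0 else 0.0
    pub.publish(move)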

Step 5: See the Python scripts in action

Open a terminal and run the movement detector:

user:~$ rosrun odom_movement_detector odom_movement_detector.py

Open another terminal and check the distance being published to the /moved_distance topic:

user:~$ rostopic echo /moved_distance
data: 0.00280330822873
---
data: 0.00280530307334

You can see from the output above that the distance is barely changing, because the robot is stationary. Now let’s move the robot:

user:~$ rosrun test_movement test_movement.py

Now you should see the robot moving and moved_distance increasing. It should stop once moved_distance exceeds 2.0.

user:~$ rostopic echo /moved_distance
data: 0.00280330822873
---
data: 0.00280330822873
---
data: 0.00280330822873
---
data: 0.00280330822873
---
data: 0.00280330822873
---
# ...
data: 1.00280330822873
---
# ...
data: 2.00280330822873
---
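
If you want to double-check that the stop command was actually sent, you can also echo the velocity topic used by the test script; the linear.x value should drop back to 0.0 once the threshold is passed:

user:~$ rostopic echo /cmd_vel/linear/x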

And that’s it. I hope you had some fun!

Extra: Video of the post

Here below you have a “sights and sounds” version of this post, just in case you prefer it that way. Enjoy!

Feedback

Did you like this post? Do you have any questions about the explanations? Whatever the case, please leave a comment on the comments section below, so we can interact and learn from each other.

If you want to learn about other ROS or ROS2 topics, please let us know in the comments area and we will do a video or post about it.


Edited by Bayode Aderinola

[ROS Q&A] 194 – Add Pressure sensors in RVIZ


What will you learn in this post

  • How to connect world frame to base_link of DogBot
  • How to publish pressure sensors Markers for RVIZ

List of resources used in this post

Overview

In the previous post, we learned how to add pressure sensors in a Gazebo simulation. Now we are going to learn how to see them in RViz (the ROS visualization tool). Remember that there is a question on ROS Answers about contact sensors, which is what originated this post.

Start the provided ROSject

The first thing you need to do is to get a copy of the ROSject we mentioned above if you want everything ready to go. You can also clone the aforementioned git repository if you want to run everything on your local computer. If you go for the ROSject option, just click on the ROSject link to get an automatic copy. You should then see a ROSject called DogBotTactileSensors in your list of ROSjects, something like the image below:

DogBot tactile sensors in ROSDS


After clicking on the Open button, the ROSject will open in a remote computer launched on ROSDS.

Launching the simulation

As you may remember from the previous post, we can launch the simulation by running roslaunch dogbot_gazebo main.launch in a web shell. You can also use the menu by clicking on Simulations -> Choose Simulation -> Choose Launch File and selecting the main.launch file in the dogbot_gazebo package:

dogbot_gazebo main.launch in ROSDS


You should now have DogBot up and running in ROSDS:

dogbot robot running in ROSDS

Adding the sensors to RViz

Let’s start by stopping the robot. For that, we can just run the keyboard teleoperation with rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=/dogbot/cmd_vel  to move DogBot and then hit K to stop it.

Now we open a new shell by clicking on Tools -> Shell and run:

rosrun dogbot_markers arrows_rviz.py

This command essentially gets the data published by the sensors and converts it into markers that can be shown in RViz.

Now, to see RViz, we have to launch it by running the command below in a new shell:

rosrun rviz rviz

This will open RViz, but in order to see it, you have to open the Graphical Tools by clicking Tools -> Graphical Tools.

Launching Graphical Tools in ROSDS


Once you are in RViz, open a config file by clicking File -> Open Config and select the dogbot.rviz file located in ~/simulation_ws/src/dogbot_tc/dogbot_markers/rviz/dogbot.rviz

Select the rviz config file for dogbot

Select the rviz config file dogbot.rviz for dogbot

The file you open has everything configured to show the contact markers in RViz.

The pressure on each of DogBot’s feet is represented by an arrow whose length and color are proportional to the registered pressure. You can see the arrows changing length as the robot steps.

How the markers are created

In the folder ~/simulation_ws/src/dogbot_tc/dogbot_markers/scripts we created a script called world_to_base_tf_publisher.py with the following content:

#! /usr/bin/env python
import rospy
import time
import tf
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Pose


class WorldBaseTFPublisher(object):

    def __init__(self):
        rospy.loginfo("Start init WorldBaseTFPublisher Class")
        self._br = tf.TransformBroadcaster()
        rospy.loginfo("set up tf.TransformBroadcaster DONE")
        self._current_pose = Pose()
        rospy.loginfo("_current_pose DONE")
        self.get_init_position()
        rospy.loginfo("get_init_position DONE")
        self._sub = rospy.Subscriber('/dogbot/odom', Odometry, self.odom_callback)
        rospy.loginfo("self._sub DONE")

    def get_init_position(self):
        data_odom = None
        r = rospy.Rate(2)
        while data_odom is None and not rospy.is_shutdown():
            try:
                data_odom = rospy.wait_for_message("/dogbot/odom", Odometry, timeout=1)
            except:
                rospy.loginfo("Current odom not ready yet, retrying for setting up init pose")
                try:
                    r.sleep()
                except rospy.ROSInterruptException:
                    # This avoids an error when the world is reset and time goes backwards.
                    pass

        self._current_pose = data_odom.pose.pose

    def odom_callback(self, msg):
        self._current_pose = msg.pose.pose

    def get_current_pose(self):
        return self._current_pose

    def handle_turtle_pose(self, pose_msg, link_name, world_name = "/world"):

        self._br.sendTransform(
                                (pose_msg.position.x,
                                 pose_msg.position.y,
                                 pose_msg.position.z),
                                (pose_msg.orientation.x,
                                 pose_msg.orientation.y,
                                 pose_msg.orientation.z,
                                 pose_msg.orientation.w),
                                rospy.Time.now(),
                                link_name,
                                world_name)

    def publisher_of_tf(self):

        frame_link_name = "base_link"
        time.sleep(1)
        rospy.loginfo("Ready..Starting to Publish TF data now...")

        rate = rospy.Rate(50)
        while not rospy.is_shutdown():
            pose_now = self.get_current_pose()
            if not pose_now:
                print "The Pose is not yet available...Please try again later"
            else:
                self.handle_turtle_pose(pose_now, frame_link_name)
            try:
                rate.sleep()
            except rospy.ROSInterruptException:
                # This avoids an error when the world is reset and time goes backwards.
                pass


if __name__ == '__main__':
    rospy.init_node('publisher_of_world_base_tf_node', anonymous=True)
    rospy.loginfo("STARTING WORLS TO BASE TF PUBLISHER...")
    world_base_tf_pub = WorldBaseTFPublisher()
    world_base_tf_pub.publisher_of_tf()

In this file we subscribe to the /dogbot/odom topic, which tells us where the robot is in the world, and publish the tf from world to base_link. With the command rosrun tf view_frames, we generate the frames.pdf file. By downloading that file we can see that world is connected to base_link, which is the base of the robot.

tf world connected to base_link frame on ROSDS


The connection between the two frames is important because the sensors are represented in the world frame, but are detected by the robot, which is represented by the base_link. If you want to learn more about tf, please consider taking the course TF ROS 101 in Robot Ignite Academy.

The file world_to_base_tf_publisher.py is automatically launched by put_robot_in_world.launch when we launch the main.launch file.
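
If you want to confirm from a shell that this transform is actually being broadcast, one quick check (for example) is tf_echo, which prints the current translation and rotation between the two frames:

rosrun tf tf_echo world base_link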

How the markers are published in RViz

The markers are created in the script ~/simulation_ws/src/dogbot_tc/dogbot_markers/scripts/arrows_rviz.py, which has the following content:

#!/usr/bin/env python

import rospy
from visualization_msgs.msg import Marker
from geometry_msgs.msg import Point
import tf
import numpy
from std_msgs.msg import String
from gazebo_msgs.msg import ContactsState
import math

class MarkerBasics(object):
    def __init__(self, topic_id):
        marker_topic = '/marker_basic_'+topic_id
        self.marker_objectlisher = rospy.Publisher(marker_topic, Marker, queue_size=1)
        self.rate = rospy.Rate(25)
        self.init_marker(index=0)

    def init_marker(self, index=0):
        self.marker_object = Marker()
        self.change_frame(frame="/world", ns="dogbot", index=0)
        self.marker_object.type = Marker.ARROW
        self.marker_object.action = Marker.ADD

        self.change_position(x=0.0, y=0.0, z=0.0)
        self.change_orientation(pitch=0.0, yaw=0.0)
        self.change_scale()
        self.change_colour(R=1.0, G=0.0, B=0.0)

        # 0 means the marker lasts forever; otherwise, number of seconds before disappearing
        self.marker_object.lifetime = rospy.Duration(0)

    def change_orientation(self, pitch, yaw):
        """
        Roll doesn't make any sense for an arrow.
        :param pitch: Up/Down. We clip it to the range [-1.5708, 1.5708]
        :param yaw: Left/Right, no clamp
        :return:
        """
        pitch = numpy.clip(pitch, -1.5708,1.5708)

        q = tf.transformations.quaternion_from_euler(0.0, pitch, yaw)

        self.marker_object.pose.orientation.x = q[0]
        self.marker_object.pose.orientation.y = q[1]
        self.marker_object.pose.orientation.z = q[2]
        self.marker_object.pose.orientation.w = q[3]

    def change_position(self, x, y, z):
        """
        Position of the starting end of the arrow
        :param x:
        :param y:
        :param z:
        :return:
        """

        my_point = Point()
        my_point.x = x
        my_point.y = y
        my_point.z = z
        self.marker_object.pose.position = my_point
        #rospy.loginfo("PositionMarker-X="+str(self.marker_object.pose.position.x))

    def change_colour(self, R, G, B):
        """
        All colours go from [0.0,1.0].
        :param R:
        :param G:
        :param B:
        :return:
        """

        self.marker_object.color.r = R
        self.marker_object.color.g = G
        self.marker_object.color.b = B
        # Alpha has to be set, otherwise the marker will be transparent
        self.marker_object.color.a = 1.0

    def change_scale(self, s_x=1.0, s_y=0.1, s_z=0.1):
        """

        :param s_x:
        :param s_y:
        :param s_z:
        :return:
        """

        self.marker_object.scale.x = s_x
        self.marker_object.scale.y = s_y
        self.marker_object.scale.z = s_z

    def start(self):
        pitch = -0.7
        yaw = 0.0
        s_x = 1.0

        while not rospy.is_shutdown():
            #self.change_orientation(pitch=pitch,yaw=yaw)
            self.change_scale(s_x=s_x)
            self.marker_objectlisher.publish(self.marker_object)
            self.rate.sleep()
            s_x -= 0.01

    def translate(self, value, leftMin, leftMax, rightMin, rightMax):
        # Figure out how 'wide' each range is
        leftSpan = leftMax - leftMin
        rightSpan = rightMax - rightMin

        # Convert the left range into a 0-1 range (float)
        valueScaled = float(value - leftMin) / float(leftSpan)

        # Convert the 0-1 range into a value in the right range.
        return rightMin + (valueScaled * rightSpan)

    def pressure_to_wavelength_to_rgb(self, pressure, min_pressure=-50.0, max_pressure=50.0, gamma=0.8):

        '''This converts a given wavelength of light to an
        approximate RGB color value. The wavelength must be given
        in nanometers in the range from 380 nm through 750 nm
        (789 THz through 400 THz).

        Based on code by Dan Bruton
        http://www.physics.sfasu.edu/astro/color/spectra.html
        '''

        wavelength = self.translate(value=pressure,
                                   leftMin=min_pressure, leftMax=max_pressure,
                                   rightMin=380, rightMax=750)

        wavelength = float(wavelength)

        rospy.logdebug("pressure=" + str(pressure))
        rospy.logdebug("wavelength="+str(wavelength))

        if wavelength >= 380 and wavelength <= 440:
            attenuation = 0.3 + 0.7 * (wavelength - 380) / (440 - 380)
            R = ((-(wavelength - 440) / (440 - 380)) * attenuation) ** gamma
            G = 0.0
            B = (1.0 * attenuation) ** gamma
        elif wavelength >= 440 and wavelength <= 490:
            R = 0.0
            G = ((wavelength - 440) / (490 - 440)) ** gamma
            B = 1.0
        elif wavelength >= 490 and wavelength <= 510:
            R = 0.0
            G = 1.0
            B = (-(wavelength - 510) / (510 - 490)) ** gamma
        elif wavelength >= 510 and wavelength <= 580:
            R = ((wavelength - 510) / (580 - 510)) ** gamma
            G = 1.0
            B = 0.0
        elif wavelength >= 580 and wavelength <= 645:
            R = 1.0
            G = (-(wavelength - 645) / (645 - 580)) ** gamma
            B = 0.0
        elif wavelength >= 645 and wavelength <= 750:
            attenuation = 0.3 + 0.7 * (750 - wavelength) / (750 - 645)
            R = (1.0 * attenuation) ** gamma
            G = 0.0
            B = 0.0
        else:
            R = 0.0
            G = 0.0
            B = 0.0

        return R, G, B

    def change_frame(self, frame="/world", ns="dogbot", index=0):
        """

        :param frame:
        :return:
        """

        self.marker_object.header.frame_id = frame
        self.marker_object.header.stamp = rospy.get_rostime()
        self.marker_object.ns = ns
        self.marker_object.id = index


    def update_marker(self, frame, ns, index, position, orientation, pressure, min_pressure=0.0, max_pressure=10.0):
        """

        :param position: [X,Y,Z] in the world frame
        :param pressure: Magnitude
        :param orientation: [Pitch,Yaw]
        :return:
        """
        #self.change_frame(frame=frame, ns=ns, index=index)
        self.change_position(x=position[0], y=position[1], z=position[2])
        self.change_orientation(pitch=orientation[0], yaw=orientation[1])
        self.change_scale(s_x = pressure)

        R,G,B = self.pressure_to_wavelength_to_rgb(pressure=pressure,
                                                   min_pressure=min_pressure,
                                                   max_pressure=max_pressure,
                                                   gamma=0.8)

        rospy.logdebug("R,G,B=["+str(R)+", "+str(G)+", "+str(B)+"]")

        self.change_colour(R=R, G=G, B=B)

        self.marker_objectlisher.publish(self.marker_object)



class FootPressureInfo(object):

    def __init__(self):

        self.min_pressure = 0.0
        self.max_pressure = 2.0
        self.markerbasics_object_front_left_foot = MarkerBasics(topic_id="front_left_foot")
        self.markerbasics_object_front_right_foot = MarkerBasics(topic_id="front_right_foot")
        self.markerbasics_object_back_left_foot = MarkerBasics(topic_id="back_left_foot")
        self.markerbasics_object_back_right_foot = MarkerBasics(topic_id="back_right_foot")

        rospy.Subscriber("/dogbot/back_left_contactsensor_state", ContactsState, self.contact_callback_back_left_foot)
        rospy.Subscriber("/dogbot/back_right_contactsensor_state", ContactsState, self.contact_callback_back_right_foot)
        rospy.Subscriber("/dogbot/front_left_contactsensor_state", ContactsState, self.contact_callback_front_left_foot)
        rospy.Subscriber("/dogbot/front_right_contactsensor_state", ContactsState, self.contact_callback_front_right_foot)


    def contact_callback_front_right_foot(self, data):
        """

        :param data:
        :return:
        """
        foot_name = data.header.frame_id

        if len(data.states) >= 1:
            Fx = data.states[0].total_wrench.force.x
            Fy = data.states[0].total_wrench.force.y
            Fz = data.states[0].total_wrench.force.z
            pressure = math.sqrt(pow(Fx,2)+pow(Fy,2)+pow(Fz,2))

            Px = data.states[0].contact_positions[0].x
            Py = data.states[0].contact_positions[0].y
            Pz = data.states[0].contact_positions[0].z

            pressure = pressure / 100.0
            # rospy.loginfo(str(foot_name) + "--->pressure =" + str(pressure))
            # rospy.loginfo(str(foot_name) + "Point =[" + str(pressure))

            index = 1

            self.markerbasics_object_front_right_foot.update_marker(frame=foot_name,
                                                   ns="dogbot",
                                                   index=index,
                                                   position=[Px, Py, Pz],
                                                   orientation=[-1.57, 0.0],
                                                   pressure=pressure,
                                                   min_pressure=self.min_pressure,
                                                   max_pressure=self.max_pressure)


        else:
            # No Contact
            pass


    def contact_callback_front_left_foot(self, data):
        """

        :param data:
        :return:
        """
        foot_name = data.header.frame_id

        if len(data.states) >= 1:
            Fx = data.states[0].total_wrench.force.x
            Fy = data.states[0].total_wrench.force.y
            Fz = data.states[0].total_wrench.force.z
            pressure = math.sqrt(pow(Fx,2)+pow(Fy,2)+pow(Fz,2))

            Px = data.states[0].contact_positions[0].x
            Py = data.states[0].contact_positions[0].y
            Pz = data.states[0].contact_positions[0].z

            pressure = pressure / 100.0


            index = 0

            self.markerbasics_object_front_left_foot.update_marker(frame=foot_name,
                                                   ns="dogbot",
                                                   index=index,
                                                   position=[Px, Py, Pz],
                                                   orientation=[-1.57, 0.0],
                                                   pressure=pressure,
                                                   min_pressure=self.min_pressure,
                                                   max_pressure=self.max_pressure)
        else:
            # No Contact
            pass

    def contact_callback_back_right_foot(self, data):
        """

        :param data:
        :return:
        """
        foot_name = data.header.frame_id

        if len(data.states) >= 1:
            Fx = data.states[0].total_wrench.force.x
            Fy = data.states[0].total_wrench.force.y
            Fz = data.states[0].total_wrench.force.z
            pressure = math.sqrt(pow(Fx,2)+pow(Fy,2)+pow(Fz,2))

            Px = data.states[0].contact_positions[0].x
            Py = data.states[0].contact_positions[0].y
            Pz = data.states[0].contact_positions[0].z

            pressure = pressure / 100.0
            #rospy.loginfo(str(foot_name) + "--->pressure =" + str(pressure))
            # rospy.loginfo(str(foot_name) + "Point =[" + str(pressure))

            index = 2

            self.markerbasics_object_back_right_foot.update_marker(frame=foot_name,
                                                   ns="dogbot",
                                                   index=index,
                                                   position=[Px, Py, Pz],
                                                   orientation=[-1.57, 0.0],
                                                   pressure=pressure,
                                                   min_pressure=self.min_pressure,
                                                   max_pressure=self.max_pressure)


        else:
            # No Contact
            pass


    def contact_callback_back_left_foot(self, data):
        """

        :param data:
        :return:
        """
        foot_name = data.header.frame_id

        if len(data.states) >= 1:
            Fx = data.states[0].total_wrench.force.x
            Fy = data.states[0].total_wrench.force.y
            Fz = data.states[0].total_wrench.force.z
            pressure = math.sqrt(pow(Fx,2)+pow(Fy,2)+pow(Fz,2))

            Px = data.states[0].contact_positions[0].x
            Py = data.states[0].contact_positions[0].y
            Pz = data.states[0].contact_positions[0].z

            pressure = pressure / 100.0

            index = 3

            self.markerbasics_object_back_left_foot.update_marker(frame=foot_name,
                                                   ns="dogbot",
                                                   index=index,
                                                   position=[Px, Py, Pz],
                                                   orientation=[-1.57, 0.0],
                                                   pressure=pressure,
                                                   min_pressure=self.min_pressure,
                                                   max_pressure=self.max_pressure)
        else:
            # No Contact
            pass




if __name__ == '__main__':
    rospy.init_node('footpressure_marker_node', anonymous=True)
    footpressure_object = FootPressureInfo()
    rospy.spin()

When the file is launched, 4 markers are created, each of them publishing on a specific topic. You can find the topic for each marker with rostopic list | grep marker.

We then create subscribers for the contact topics, each of them calling a callback function when a message is published. In the callback we calculate the pressure, get the contact positions, and then publish the marker to RViz.

A video version of this post

So this is the post for today. If you prefer, we also have a video version of this post available in the link below. We hope you liked the post and the video. If so, please feel free to subscribe to our channel, share this post and comment on the video section.

We would like to thank React Robotics for this amazing robot and thank you for reading through this post.

Keep pushing your ROS Learning.

Remember, if you want to learn more about markers in RViz, we highly recommend taking this course on ROS RViz Advanced Markers.

Interested in learning more? Try this course on URDF creation for Gazebo and ROS:

Robot Create with URDF

#dogbot #quadruped #pressuresensors #tactilesensors #rviz #Simulation #Gazebo #Robot

[ROS Q&A] 193 – How to load a pre-built octomap into MoveIt


About

Learn how to load a pre-built octomap into MoveIt!.

You’ll learn how to:

  • Create a Python program for loading an octomap into MoveIt
  • Set up MoveIt to detect this octomap

RELATED ROS RESOURCES & LINKS:

ROSject —▸ http://www.rosject.io/l/c52f64c/
ROS Development Studio (ROSDS) —▸ http://rosds.online
Robot Ignite Academy –▸ https://www.robotigniteacademy.com

Related Courses


ROS for Industrial Robots


ROS Manipulation Course


Feedback

Did you like this video? Do you have questions about what is explained? Whatever the case, please leave a comment on the comments section below, so we can interact and learn from each other.

If you want to learn about other ROS topics, please let us know on the comments area and we will do a video about it 🙂

#ROS #Robot #ROStutorials #Moveit #Octomap

[ROS Q&A] 192 – Add Pressure sensors in Gazebo Simulation for DogBot


What will you learn in this post

  • How to add pressure sensors in a Gazebo simulation using the DogBot robot.
  • How to convert XACRO into SDF

List of resources used in this post

Start the provided ROSject

The first thing you need to do is to have a copy of the ROSject we mentioned above. Click on the link to get an automatic copy. You should now see a ROSject called DogBotTactileSensors on your list of ROSjects, something like the image below:

DogBot tactile sensors in ROSDS


After clicking on the Open button, the ROSject will open in a remote computer launched on ROSDS.

Launching the simulation

Once the ROSject is open, you can launch a simulation. You can achieve that in two ways:

  1. Opening a web shell (clicking on Tools -> Web Shell) and typing: roslaunch dogbot_gazebo main.launch
  2. Using the menu Simulations -> Choose Simulation -> Choose Launch File. In the end, you will have a menu with a list of launch files. You can select the main.launch file in the dogbot_gazebo package as shown in the image below:
dogbot_gazebo main.launch in ROSDS


You should now have DogBot up and running in ROSDS:

dogbot robot running in ROSDS


In the simulation, you can see the robot has red “shoes” on its feet. That is where we put our pressure sensors.

Launching the simulation in your own local computer

The aforementioned steps show how to launch the simulation in ROSDS (ROS Development Studio).

If you want to install it on your local computer, you can just download the ROSject (or git clone it) and execute the requirements.sh script located in the simulation_ws/src/dogbot_tc folder, using the commands below:

cd simulation_ws/src/dogbot_tc

./requirements.sh

After that, you should be able to launch the simulation with:

  1. source /opt/ros/kinetic/setup.bash
  2. source simulation_ws/devel/setup.bash
  3. roslaunch dogbot_gazebo main.launch

Listing the ROS Topics

Once you have the simulation up and running, you can list the contact sensor topics with:

rostopic list | grep contact

We can see that the topics are working by subscribing to a contact topic, using the command below:

rostopic echo /dogbot/back_left_contactsensor_state -n1

If we look carefully at the printed data, we can see the frame_id “back_left_foot”, and in the states section we can see the objects the robot is in contact with and the forces applied.
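
If you want to process this data programmatically rather than just echoing it, a minimal sketch along the lines below (not part of the ROSject; the node name is just an example) subscribes to one of the contact topics and prints the magnitude of the contact force:

#!/usr/bin/env python
import math

import rospy
from gazebo_msgs.msg import ContactsState

def contact_callback(msg):
    # states is empty when the foot is not touching anything
    if msg.states:
        f = msg.states[0].total_wrench.force
        magnitude = math.sqrt(f.x ** 2 + f.y ** 2 + f.z ** 2)
        rospy.loginfo("%s force magnitude: %.2f N", msg.header.frame_id, magnitude)

rospy.init_node('contact_force_printer')
rospy.Subscriber('/dogbot/back_left_contactsensor_state', ContactsState, contact_callback)
rospy.spin()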

Finding the code

The code used to define the sensors is located in ~/simulation_ws/src/dogbot_tc/dogbot_description/urdf/dogbot.xacro

In ROSDS you can see it easily by using the Code Editor.

dogbot.xacro in ROSDS

If you search for “Contact Sensor” in the code, you will find it around line 350, inside the <gazebo> tag of the dogbot.xacro file.

Going deeper into the code

In the dogbot.xacro file, the sensor is defined around line 350 as shown below:

      <sensor name="${prefix}_${suffix}_contactsensor_sensor" type="contact">
        <always_on>true</always_on>
        <contact>
          <collision>${prefix}_${suffix}_lowerleg_fixed_joint_lump__${prefix}_${suffix}_foot_collision_5</collision>
        </contact>
        <plugin name="${prefix}_${suffix}_foot_plugin" filename="libgazebo_ros_bumper.so">
          <bumperTopicName>${prefix}_${suffix}_contactsensor_state</bumperTopicName>
          <!--<frameName>${prefix}_${suffix}_foot</frameName>-->
          <frameName>world</frameName>
        </plugin>
      </sensor>

We can see the type="contact", and the Gazebo plugin named libgazebo_ros_bumper.so.

One of the most important parts of the code is the contact part:

<contact>
  <collision>${prefix}_${suffix}_lowerleg_fixed_joint_lump__${prefix}_${suffix}_foot_collision_5</collision>
</contact>

Further details about the collision can be found in this video: https://youtu.be/RcrgC9o9A6A?t=269

Converting .xacro into .sdf file

Gazebo uses .sdf files to launch simulations, but the code we showed so far is a .xacro one (dogbot.xacro).

In order to convert the .xacro into .sdf you can just run:

cd ~/simulation_ws/src/dogbot_tc/dogbot_description/urdf

./xacro_to_sdf.sh

That script generates the dogbot.sdf file that is already in the ROSject. The content of the xacro_to_sdf.sh file is:

#!/usr/bin/env bash
echo "Cleaning sdf and urdf"
rm -rf dogbot.sdf dogbot.urdf
rosrun xacro xacro.py dogbot.xacro > dogbot.urdf
gz sdf -p dogbot.urdf > dogbot.sdf
echo "Generated SDF

Seeing the markers on RViz

You can see the contact sensors in RViz. For that you can run the following command in a webshell:

rosrun dogbot_markers arrows_rviz.py

In a different shell you run:

rosrun rviz rviz

Open the Graphical Tools and select the rviz file inside the ~/simulation_ws/src/dogbot_tc/dogbot_markers/rviz folder. For that you first need to click Tools -> Graphical Tools, then select the rviz config file.

Launching Graphical Tools in ROSDS


Select the rviz config file for dogbot


 

dogbot rviz sensors on ROSDS


Here the pressure on each of DogBot’s feet is represented by an arrow whose length and color are proportional to the registered pressure.

If you want a deeper understanding of how the markers are created and published in RViz, please have a look at the post on adding pressure sensors in RViz.

How to move DogBot

To move the robot you can easily use the command below:
rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=/dogbot/cmd_vel

Hit K to make DogBot stop. To lower the speed, hit Z on the keyboard until hitting I (go forward) produces a stable step; a speed of around 0.1 works well.

A video version of this post

So this is the post for today. If you prefer, we also have a video version of this post available in the link below. We hope you liked the post and the video. If so, please feel free to subscribe to our channel, share this post and comment on the video section.

We would like to thank React Robotics for this amazing robot and thank you for reading through this post.

Keep pushing your ROS Learning.

Interested in learning more? Try out this course on URDF creation for Gazebo and ROS:

Robot Create with URDF

#dogbot #quadruped #pressuresensors #tactilesensors #Simulation #Gazebo #Robot

ROS2 Tutorials #6: How to create a ROS2 launch file [NEW]


About

In this post, you will learn how to create a ROS2 launch file.

The problem: we have a ROS2 C++ package but it has no launch file. Yes, we can run the code of this package using ros2 run..., but we also want to be able to run it using ros2 launch...

Let’s get to work right away and solve this problem!

Step 1: Get the package without a launch file

Remember that we already have a package? Get a copy of the ROSject containing this package using this link: http://www.rosject.io/l/bce4ffd/.

If you would like to learn how we created this ROSject and package, please see this post: https://www.theconstruct.ai/ros2-tutorials-create-a-ros2-package-cpp/.

Step 2: Recompile the package

Open the ROSject. This might take a few moments, please be patient.

Once the ROSject opens, fire up a Shell from the Tools menu and run the following commands to recompile the package.

user:~$ source /opt/ros/crystal/setup.bash
ROS_DISTRO was set to 'melodic' before. Please make sure that the environment does not mix paths from different distributions.
user:~$ cd ros2_ws
user:~/ros2_ws$ colcon build --symlink-install
Starting >>> ros2_cpp_pkg
 
Finished <<< ros2_cpp_pkg [21.9s]
 
Summary: 1 package finished [22.0s]
user:~/ros2_ws$ source install/setup.bash # source the workspace
ROS_DISTRO was set to 'crystal' before. Please make sure that the environment does not mix paths from different distributions. 
ROS_DISTRO was set to 'melodic' before. Please make sure that the environment does not mix paths from different distributions.
user:~/ros2_ws$

Step 3: Create a ROS2 launch file for this package

Create a launch directory within the package and create a launch file within the directory.

On the Shell, run:

user:~/ros2_ws$ cd src/ros2_cpp_pkg/
user:~/ros2_ws/src/ros2_cpp_pkg$ mkdir launch
user:~/ros2_ws/src/ros2_cpp_pkg$ cd launch && touch ros2_cpp_code.launch.py
user:~/ros2_ws/src/ros2_cpp_pkg$

Fire up an IDE from the Tools menu, find the launch file just created and paste the following code into it:

"""Launch the cpp_code executable in this package"""

from launch import LaunchDescription
import launch_ros.actions


def generate_launch_description():
    return LaunchDescription([
        launch_ros.actions.Node(
            # the name of the executable is set in CMakeLists.txt, towards the end of
            # the file, in add_executable(...) and the directives following it
            package='ros2_cpp_pkg', node_executable='cpp_code', output='screen'),
    ])

Now, we need to tell the build system to install the launch directory so that ros2 launch can find the file. Open up CMakeLists.txt in the IDE and add the following lines at the bottom of it.

# install the launch directory
install(DIRECTORY
  launch
  DESTINATION share/${PROJECT_NAME}/
)

Don’t forget to save all the changes!!

Step 4: Compile and source the workspace again.

Again? Yes, again! We need to.

user:~/ros2_ws/src/ros2_cpp_pkg/launch$ cd /home/user/ros2_ws
user:~/ros2_ws$ colcon build --symlink-install
Starting >>> ros2_cpp_pkg
Finished <<< ros2_cpp_pkg [3.35s]

Summary: 1 package finished [3.48s]
user:~/ros2_ws$ source install/setup.bash # source the workspace
user:~/ros2_ws$
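
Optionally, you can check that the launch directory really was installed. With colcon's default layout (an assumption here; adjust the path if your install folder differs) the file should show up under the package's share directory:

user:~/ros2_ws$ ls install/ros2_cpp_pkg/share/ros2_cpp_pkg/launch
ros2_cpp_code.launch.py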

Step 5: Launch the package using the new launch file and be happy!

You’ve been working very hard, time to eat the fruit of your labor!

The format for the command is ros2 launch <package_name> <launch_file_name>.

We are launching the C++ code in src/ros2_cpp_code.cpp, and the output should be something like this:

user:~/ros2_ws$ ros2 launch ros2_cpp_pkg ros2_cpp_code.launch.py
[INFO] [launch]: process[cpp_code-1]: started with pid [13071]
[INFO] [ObiWan]: Help me Obi-Wan Kenobi, you're my only hope
[INFO] [launch]: process[cpp_code-1]: process has finished cleanly
user:~/ros2_ws$

Sweet! Done!!

Extra 1: ROSject link

Get the ROSject containing all code used in the post in the following link: http://www.rosject.io/l/bcf8985/.

Extra 2: ROS2 Full Course for Beginners

Extra 3: Video

Prefer to watch a video demonstrating the steps above? We have one for you below!

Related Resources

ROS2 Basics

Feedback

Did you like this post? Do you have questions about what is explained? Whatever the case, please leave a comment on the comments section below, so we can interact and learn from each other.

If you want to learn about other ROS topics, please let us know in the comments area and we will do a video/post about it 🙂
