In this post, you will learn how to create a simple ROS2 package for C++. You don’t need a ROS2 installation for this, as we will use the ROS Development Studio (ROSDS), an online platform that provides access to ROS (1 or 2) computers and other powerful ROS tools within a browser!
PS: If you have ROS2 installed locally on your machine, you can skip Step 1.
Step 1: Create a project in ROS Development Studio (ROSDS)
Head to http://rosds.online and create a project with a configuration similar to the one shown below. You can change the details as you like, but please make sure you select “Ubuntu 18.04 + ROS2 Crystal” under “Configuration”.
Once done with that, open your ROSject. This might take a few moments, please be patient.
Step 2: Source the ROS2 workspace
Once the ROSject is open, head to the Tools menu and pick the Shell tool (if on your local PC just fire up a terminal) and run the following command to source the workspace:
user:~$ source /opt/ros/crystal/setup.bash
ROS_DISTRO was set to 'melodic' before. Please make sure that the environment does not mix paths from different distributions.
user:~$
If you get that ROS_DISTRO warning, just ignore it.
Step 3: Create a ROS2 package in your ROS2 workspace
The syntax for creating a ROS2 package is ros2 pkg create <package_name> --build-type <build_type> --dependencies <dependencies_separated_by_single_space>.
In the same terminal as in Step 2, change to your ROS2 workspace src directory and create a package there:
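Following that syntax, and assuming ament_cmake as the build type and rclcpp as the only dependency (which is all the example code below needs), the command looks like this:
user:~$ cd ~/ros2_ws/src
user:~/ros2_ws/src$ ros2 pkg create ros2_cpp_pkg --build-type ament_cmake --dependencies rclcpp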
Create a file named ros2_cpp_code.cpp inside the new package:
user:~/ros2_ws/src$ cd ros2_cpp_pkg/src
user:~/ros2_ws/src/ros2_cpp_pkg/src$ touch ros2_cpp_code.cpp
user:~/ros2_ws/src/ros2_cpp_pkg/src$
Pick the IDE from the Tools menu, locate the C++ file and paste in the following code:
#include "rclcpp/rclcpp.hpp"

int main(int argc, char *argv[]) {
    rclcpp::init(argc, argv);
    auto node = rclcpp::Node::make_shared("ObiWan");
    RCLCPP_INFO(node->get_logger(),
                "Help me Obi-Wan Kenobi, you're my only hope");
    rclcpp::shutdown();
    return 0;
}
In the IDE, open up the CMakeLists.txt and paste the following lines at the bottom of the file. We are doing this to ensure the C++ file will be properly detected and compiled.
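A minimal version of those lines, assuming the executable is named cpp_code (the name used with ros2 run later in this post), would be:
add_executable(cpp_code src/ros2_cpp_code.cpp)
ament_target_dependencies(cpp_code rclcpp)
install(TARGETS cpp_code DESTINATION lib/${PROJECT_NAME})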
Now your package is ready to compile…so compile it and source the workspace!
user:~/ros2_ws/src/ros2_cpp_pkg/src$ cd /home/user/ros2_ws
user:~/ros2_ws$ colcon build --symlink-install
Starting >>> ros2_cpp_pkg
Finished <<< ros2_cpp_pkg [21.9s]
Summary: 1 package finished [22.0s]
user:~/ros2_ws$ source install/setup.bash # source the workspace
ROS_DISTRO was set to 'crystal' before. Please make sure that the environment does not mix paths from different distributions. ROS_DISTRO was set to 'melodic' before. Please make sure that the environment does not mix paths from different distributions.
user:~/ros2_ws$
Great. Now test-run it. You should have the string mentioned in the C++ code printed to the terminal output:
user:~/ros2_ws$ ros2 run ros2_cpp_pkg cpp_code
[INFO] [ObiWan]: Help me Obi-Wan Kenobi, you're my only hope
user:~/ros2_ws$
Done!
Extra 1: ROSject link
Get the ROSject containing all code used in the post in the following link: http://www.rosject.io/l/bce4ffd/!
Extra 2: Video
Prefer to watch a video demonstrating the steps above? We have one for you below!
Did you like this post? Do you have questions about what is explained? Whatever the case, please leave a comment in the comments section below, so we can interact and learn from each other.
If you want to learn about other ROS topics, please let us know in the comments area and we will do a video or post about it 🙂
Following up on our previous post about how to control a Robot Arm with ROS, today we are going to learn how to set up a MoveIt package for controlling the arm.
In short, in this post we will learn the following:
How to create a basic MoveIt package for your own robot
How to test that package by moving and planning with MoveIt.
Special thanks to Clarkson University, and especially James Carrol and his team, for lending us the physical Open Manipulator robot.
Getting the code
We are going to create the MoveIt package which uses some URDF files. In order to get the URDF files, please get the following ROSject: http://www.rosject.io/l/b368f5c/
You can open it by clicking on the Open button or you can download it by clicking on the download button, the ones pointed below by the red and green arrows respectively.
The ROSject mentioned here basically contains the following git repositories:
When you open a ROSject, by default you have the Jupyter Notebook automatically open, but if that doesn’t happen, you can manually open it by clicking on Tools -> Jupyter Notebook as shown in the image below:
Now on the Jupyter Notebook window let’s click openmanipulator_morpheus_chair_notebooks, then click on Ep3_MoveIt!_First_Steps.ipynb to open it. That notebook contains the instructions we are going to follow in this post.
The URDF file we are going to use
If you opened or downloaded the ROSject, you will see that we have a package called open_manipulator_support_description with a .xacro file called open_manipulator_support.urdf.xacro. If you opened the ROSject on ROSDS (ROS Development Studio), the full path to that file is: ~/simulation_ws/src/open_manipulator_tc/open_manipulator_support_description/urdf
The content of open_manipulator_support.urdf.xacro is:
Now that we know which urdf/xacro file we are going to use, let's create a MoveIt package. In the ROSject we are using, that package is already created for you and is available at the path /home/user/catkin_ws/src/openmanipulator_ep2_movit_config, but we are going to show how we created it.
In order to open the MoveIt Assistant, the program used to create our first MoveIt Package, we used the following commands:
cd ~/catkin_ws/src
roslaunch moveit_setup_assistant setup_assistant.launch
MoveIt Assistant is a graphical application, so, after running the aforementioned command, on ROSDS you have to open the Graphical Tools by clicking Tools -> Graphical Tools as shown below:
After clicking on that button we should see MoveIt Setup Assistant as in the image below:
If you want the Graphical Tools to be opened in a different tab of the web browser, you can simply click on the Detach button as shown below:
Creating the MoveIt package from scratch
With the MoveIt Assistant open, let’s click on the Create New MoveIt Configuration Package button as shown in the image below:
Now we need to select our URDF file. For that, let's click on the Browse button under the section Load a URDF or Collada Robot Model and select the URDF file mentioned earlier, which is at the path:
~/simulation_ws/src/open_manipulator_tc/open_manipulator_support_description/urdf/open_manipulator_support.urdf.xacro
Then let's click Load Files as in the image below.
Now you should have the URDF loaded. Make sure you can see it properly as in the right side of the image below:
Generate Collision Matrix
The first thing that is vital for a robot arm when it moves is to NOT HIT ITSELF. That seems obvious, but it is a really common way to break a thousand-euro robot arm if it doesn't have the correct safety features, like peak-torque detection or some kind of external perception.
We have to generate what is called the Self-Collision Matrix. To do that, we use the Self-Collision Matrix Generator. We need to generate this matrix because:
It detects which links will collide with each other when moving.
It also detects the links that will never collide with each other, so we can remove the self-collision calculations for them, lowering the processing burden.
It detects which links are always in collision; we assume that is normal and therefore also disable those calculations.
It also disables the self-collision checks for links that are adjacent in the kinematic chain.
If you look on the left side image above, we are at the Start section. Let’s click on the Self-Collisions button that appears on the left and then click Generate Collision Matrix.
The buttons we clicked are shown below:
The Sampling Density value of 10,000 collision checks is the number of random robot positions that will be checked for self-collision. The higher it is, the better the resulting collision matrix, but it needs more time and processing power.
The Min. Collisions value of 95% means that a pair of links is only considered to be always colliding if it collides in at least 95% of all the random positions tested.
After clicking Generate Collision Matrix we have the matrix generated as in the image below.
The links are listed in a Linear View by default. Let's select the Matrix View, because that view gives a better overview.
The green cells in the image above mean “Never Collide”.
Note that because the sampling is random, the matrix may look a bit different every time you generate it:
Virtual Joints
A robot that moves around a plane is specified using a planar virtual joint that connects the world frame to the frame of the robot. A robot that doesn’t move, will use a fixed joint to connect the base frame to the world.
In our case we select:
Name: virtual_joint (just so we know it's a virtual joint)
Parent: world
Child: base_link, which we want to connect to the world.
Joint Type: Fixed, because the robot won't move around.
You can play with this: what if we select as the parent a link of another robot, like a TurtleBot that moves around? We will go deeper into these things when the time comes. DON'T FORGET TO HIT SAVE.
Planning Groups
Doing Inverse Kinematics is computationally very intensive. This means that the simpler the kinematics to solve, the better. That's why we normally divide a robot into as many parts as allow a correct and easy IK calculation. For example, a robot with TWO arms will normally be divided into LEFT_ARM and RIGHT_ARM, because we don't need to solve the inverse kinematics for both at once, but it all depends on the use case.
We will choose the following:
Solver: kdl_kinematics_plugin/KDLKinematicsPlugin as the kinematics solver. This is the plugin in charge of calculating the inverse kinematics. It's a generic kinematics solver that will be OK for now, and it is the default one in MoveIt!. It only works with serial kinematic chains (tensegrities and things like that won't work).
Name: openmanipulator_arm seems appropriate.
Kin Solv. Attempts: 3 seems reasonable.
Planner: for now we leave this as None.
Next, we add the joints!
In our case, we select ALL the Joints except the gripper.
Now we add the gripper through joints also:
joint8
joint9
id_6
NOTE that we are NOT adding link7 anywhere; that's because it serves no purpose in the planning for the moment.
We can now store premade robot poses to set the robot in safe positions, calibration poses, frequently used positions, etc.
We added four poses:
Two for the OpenManipulator_ARM group
Two for the Gripper group.
End Effector:
We can now add the gripper as an end effector. This unlocks some functionality related exclusively to end effectors:
We set:
Name: gripper
Group ENDEffector: gripper
Parent Link: Link5
Parent Group: not necessary to state it here.
Things we won’t set:
These are elements that we might enable later, but for now we leave them unset because we don't need them:
Passive Joints: these are for caster wheels and other joints that aren't actuated.
3D Perception: we don't have any sensors for the moment (we may add them afterwards).
Simulation: when executed, it tells us there is nothing to change.
ROS controllers:
Here we have two options:
Add the controllers manually: This allows us to select which type of control we are adding.
Auto-add FollowJointTrajectory controllers
For the moment we will auto-generate the FollowJointTrajectory control. If we need to change it, we just have to re-edit the MoveIt package.
Author: Add the Author info
FINALLY, generate the package
Now is the time to generate the package that we will use:
Select the location. Hit browse and create a new folder with the name of the package you want. In our case, we will call it open_manipulator_morpheuschair_movit_config.
So, that is the post for today. We hope you guys enjoyed it. Remember that we also have a video showing everything that is in this post. Please have a look at the video if you didn’t understand some of the things that were explained here. If you liked the content, please subscribe to our channel and leave your comments on the comments section of the video, which is available at:
Welcome to Morpheus Chair, the series where you learn how to build and control real robots like the Open Manipulator that we are going to use today.
In this post, we are going to learn the first steps to use the servos used to control a robot arm, so that you can build whatever you want.
We are going to see how to move the arm with Python scripts, which will be the first step towards MoveIt, the great library that we are going to use in the next posts.
How to start a position control for Dynamixel Servos
How to move the joints publishing in Topics and Services of ROS
How to capture complex movements in order to reproduce them afterwards.
Special thanks to Clarkson University, and especially James Carrol and his team, for lending us the real robot used in this post.
Let’s get started, shall we?
Where to find what we need
Before we start, it is worth mentioning that we need to be careful when working with the robot, because the servo motors are quite strong. If you get your fingers or face in the trajectory of the arm, you can get hurt. Just be careful.
You can download the ROSject and run all the commands in your own computer, but if you prefer, you can also run some of the commands online by using ROS Development Studio (ROSDS).
By clicking on http://www.rosject.io/l/b368f5c/ you will have a copy of the ROSject. You should see something like the image below:
You can now open the ROSject by clicking on the Open button or you can download it by clicking on the download button, the ones pointed below by the red and green arrows respectively.
The ROSject mentioned here basically contains the following git repositories:
When you open a ROSject, by default you have the Jupyter Notebook automatically open, but if that doesn’t happen, you can manually open it by clicking on Tools -> Jupyter Notebook as shown in the image below:
Now on the Jupyter Notebook window let’s click openmanipulator_morpheus_chair_notebooks, then click on Ep2_Dynamixel_Control.ipynb to open it. That notebook contains the instructions we are going to follow in this post.
Once again, be careful when working with real robots: the servo motors used today, for example, don't have force feedback, so if you somehow get a finger stuck there it can really hurt you.
Connecting our robot to our local computer
We are assuming you already have the robot. Now let’s connect it to our computer:
Once the USB cable is connected, let's check that our device is detected. If we run the command below:
dmesg -wH
and then disconnect the USB cable, we will see a message saying so:
[ +16,678853] usb 3-1: USB disconnect, device number 9
[ +0,000491] ftdi_sio ttyUSB0: FTDI USB Serial Device converter now disconnected from ttyUSB0
[ +0,000059] ftdi_sio 3-1:1.0: device disconnected
If we now connect our robot again, we should see a message confirming that:
[ +7,356777] usb 3-1: new high-speed USB device number 9 using xhci_hcd
[ +0,134790] usb 3-1: New USB device found, idVendor=0403, idProduct=6014
[ +0,000011] usb 3-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[ +0,000006] usb 3-1: Product: USB <-> Serial Converter
[ +0,000005] usb 3-1: Manufacturer: FTDI
[ +0,000004] usb 3-1: SerialNumber: FT2H2ZXW
[ +0,003952] ftdi_sio 3-1:1.0: FTDI USB Serial Device converter detected
[ +0,000086] usb 3-1: Detected FT232H
[ +0,000876] usb 3-1: FTDI USB Serial Device converter now attached to ttyUSB0
Note that our device name is called ttyUSB0 according to the logs. The name is important because we have to use it when setting up our program.
Another way of getting more information about the connected device (robot) is with the command below:
lsusb
which will give us the following output when our robot is not connected.
$ lsusb
Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 006: ID 0cf3:3005 Atheros Communications, Inc. AR3011 Bluetooth
Bus 001 Device 004: ID 04f2:b2cf Chicony Electronics Co., Ltd
Bus 001 Device 003: ID 04d9:a067 Holtek Semiconductor, Inc.
Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
If we now connect the robot again, we can see it now:
$ lsusb
Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 003 Device 010: ID 0403:6014 Future Technology Devices International, Ltd FT232H Single HS USB-UART/FIFO IC
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 006: ID 0cf3:3005 Atheros Communications, Inc. AR3011 Bluetooth
Bus 001 Device 004: ID 04f2:b2cf Chicony Electronics Co., Ltd
Bus 001 Device 003: ID 04d9:a067 Holtek Semiconductor, Inc.
Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
By comparing the messages when our device is connected with the messages when it is not connected, we can clearly see that our robot is the Future Technology Devices International, Ltd FT232H Single HS USB-UART/FIFO IC entry.
Running the controllers
If you downloaded the ROSject and extracted it in your home folder (/home/your_username), you should have a simulation_ws folder there. In that folder, we will find a package called dynamixel_workbench_controllers, more precisely at the ~/simulation_ws/src/open_manipulator_tc/dynamixel-workbench/dynamixel_workbench_controllers path.
In that package, you can find a launch file called position_control.launch, which contains the following content.
Note that on the second line we have the name of our device /dev/ttyUSB0. If in your computer the device has a different name, please update that line with the correct value. You can use the command dmesg -wH to see what is your device’s name.
On the third line, we have the baud_rate, which in our case is 3000000. To get the baud rate of our device, we just run the command below with the robot connected:
stty < /dev/ttyUSB0
which in our case showed that our baud rate is 3000000.
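With the device name and baud rate set in the launch file, we start the position controller (this is the roslaunch command referenced in the rest of this post):
roslaunch dynamixel_workbench_controllers position_control.launch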
If everything goes ok, the robot now should be stiff. You shouldn’t be able to move it.
If after running the roslaunch command you have an error like the one below, that happens because you may have a different baud rate. If that is the case, please update the device_name and baud_rate variables on the launch file with the correct values.
[ERROR] [1562837899.024503906]: Not found Motors, Please check scan range or baud rate
================================================================================
REQUIRED process [position_control-2] has died!
process has died [pid 14723, exit code -11, cmd /home/rdaneel/ros_playground/open_manipulator_ws/devel/lib/dynamixel_workbench_controllers/position_control __name:=position_control __log:=/home/rdaneel/.ros/log/9c5dd618-a3bf-11e9-ba51-9cb70d2d9370/position_control-2.log].
log file: /home/rdaneel/.ros/log/9c5dd618-a3bf-11e9-ba51-9cb70d2d9370/position_control-2*.log
Initiating shutdown!
================================================================================
If you open a new bash terminal and run rostopic list you should see at least the topics below:
$ rostopic list
/dynamixel_state
/goal_dynamixel_position
/joint_states
/rosout
/rosout_agg
We can also check the services we have. With rosservice list we should have at least the services below:
If you followed the instructions correctly, running the commands rostopic list and rosservice list on ROSDS should give you the same output as when we ran them on our own computer, to which the robot is connected.
Diving deeper
With rostopic list we can see that we have basically three topics: /dynamixel_state, /goal_dynamixel_position and /joint_states.
These are the basic topics that should always exist for any robot if you use the controller we are using here.
If we want to know more information about a specific topic, like /dynamixel_state for example, we can just run rostopic info /dynamixel_state. In this command we will see the list of publishers, subscribers and the message type, which for this topic is:
$ rostopic info /dynamixel_state
Type: dynamixel_workbench_msgs/DynamixelStateList
With rostopic echo /dynamixel_state -n1 we can see the model_name of each servo motor.
With rostopic echo /dynamixel_state -n1 we saw the present_position of each joint in a raw way. By printing the messages published on the /joint_states topic we can see the positions in radians:
If we run rostopic info /goal_dynamixel_position, we can see that it has only subscribers, no publishers. That happens because the robot is listening to that topic and nobody is telling it to move yet. Let's publish our first message there so that our arm can move.
Moving the arm
In order to move the arm, we have to publish a message on the /goal_dynamixel_position topic. For that we can use the command below:
We will fill the name, position, velocity, and effort fields with the same data we got from rostopic echo /joint_states -n1. We are just going to change the value of the first joint to 0.7. In the end, our message will be something like:
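As a sketch (the joint names below are placeholders; use the exact names and values that rostopic echo /joint_states -n1 gave you, changing only the first position to 0.7):
rostopic pub -1 /goal_dynamixel_position sensor_msgs/JointState "{name: ['joint1', 'joint2', 'joint3', 'joint4', 'joint5', 'joint6', 'joint7'], position: [0.7, 1.03, -2.34, -0.01, -1.48, 0.0, -1.05], velocity: [0, 0, 0, 0, 0, 0, 0], effort: [0, 0, 0, 0, 0, 0, 0]}"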
Bear in mind that unit can be ‘rad‘, ‘raw’ or nothing. If we select raw or nothing, the effect is the same.
Creating a python ROS Node to move the servos
To move the arm with python code we have two options:
We move it with a Service Client ( /joint_command ): we can only move one joint at a time, but it waits until the movement has finished.
We move it by publishing on a Topic ( /goal_dynamixel_position ): it moves everything at the same time, but there is no waiting.
We are going to create a Python class that allows us to use both. If you have the ROSject, the ROS package is already created for you at ~/simulation_ws/src/openmanipulator_morpheus_chair/openmanipulator_morpheus_chair_tutorials. We created that package with the commands below:
cd ~/simulation_ws
source devel/setup.bash
rospack profile
cd src
catkin_create_pkg openmanipulator_morpheus_chair_tutorials rospy dynamixel_workbench_msgs sensor_msgs
cd openmanipulator_morpheus_chair_tutorials
mkdir scripts
touch scripts/move_openmanipulator.py
chmod +x scripts/move_openmanipulator.py
On the move_openmanipulator.py file we have the following content:
#! /usr/bin/env python
import time
import rospy
from sensor_msgs.msg import JointState
from dynamixel_workbench_msgs.srv import JointCommand, JointCommandRequest
from std_msgs.msg import Header
class OpenManipulatorMove(object):
    def __init__(self):
        rospy.loginfo("OpenManipulatorMove INIT...Please wait.")
        # We subscribe to the joint states to have info of the system
        self.joint_states_topic_name = '/joint_states'
        self._check_join_states_ready()
        sub = rospy.Subscriber(self.joint_states_topic_name, JointState, self.joint_states_callback)
        # We start the Publisher for the positions of the joints
        self.goal_dynamixel_position_publisher = rospy.Publisher('/goal_dynamixel_position',
                                                                 JointState,
                                                                 queue_size=1)
        # Wait for the service client /joint_command to be running
        joint_command_service_name = "/joint_command"
        rospy.wait_for_service(joint_command_service_name)
        # Create the connection to the service
        self.joint_command_service = rospy.ServiceProxy(joint_command_service_name, JointCommand)
        rospy.loginfo("OpenManipulatorMove Ready!")

    def joint_states_callback(self, msg):
        """
        rosmsg show sensor_msgs/JointState
            std_msgs/Header header
              uint32 seq
              time stamp
              string frame_id
            string[] name
            float64[] position
            float64[] velocity
            float64[] effort
        :param msg:
        :return:
        """
        self.joint_states_msg = msg

    def _check_join_states_ready(self):
        self.joint_states_msg = None
        rospy.logdebug("Waiting for "+self.joint_states_topic_name+" to be READY...")
        while self.joint_states_msg is None and not rospy.is_shutdown():
            try:
                self.joint_states_msg = rospy.wait_for_message(self.joint_states_topic_name, JointState, timeout=5.0)
                rospy.logdebug("Current "+self.joint_states_topic_name+" READY=>")
            except:
                rospy.logerr("Current "+self.joint_states_topic_name+" not ready yet, retrying ")

    def move_all_joints(self, position_array):
        rospy.logwarn("move_all_joints STARTED")
        # We check that the position array has the correct number of elements
        number_of_joints = len(self.joint_states_msg.name)
        if len(position_array) == number_of_joints:
            if self.check_gripper_pos_safe(position_array[6]):
                new_joint_position = JointState()
                h = Header()
                h.stamp = rospy.Time.now()  # Note you need to call rospy.init_node() before this will work
                h.frame_id = self.joint_states_msg.header.frame_id
                new_joint_position.header = h
                new_joint_position.name = self.joint_states_msg.name
                new_joint_position.position = position_array
                # These values arent used, so they dont matter really
                new_joint_position.velocity = self.joint_states_msg.velocity
                new_joint_position.effort = self.joint_states_msg.effort
                rospy.logwarn("PUBLISH STARTED")
                self.goal_dynamixel_position_publisher.publish(new_joint_position)
                rospy.logwarn("PUBLISH FINISHED")
            else:
                rospy.logerr("Gripper position NOT valid=" + str(position_array[6]))
        else:
            rospy.logerr("The Array given doesnt have the correct length="+str(number_of_joints))
        rospy.logwarn("move_all_joints FINISHED")

    def move_one_joint(self, joint_id, position, unit="rad"):
        """
        rossrv show dynamixel_workbench_msgs/JointCommand
        string unit
        uint8 id
        float32 goal_position
        ---
        bool result
        :param joint_id:
        :param position:
        :param unit:
        :return:
        """
        joint_cmd_req = JointCommandRequest()
        joint_cmd_req.unit = unit
        joint_cmd_req.id = joint_id
        joint_cmd_req.goal_position = position
        if joint_id == 7:
            rospy.logwarn("CHECKING Gripper Value is safe?")
            if self.check_gripper_pos_safe(position):
                # Send the request through the service connection
                result = self.joint_command_service(joint_cmd_req)
                rospy.logwarn("move_one_joint went ok?="+str(result))
            else:
                rospy.logwarn("Gripper Value Not safe=" + str(position))
        else:
            # Send the request through the service connection
            result = self.joint_command_service(joint_cmd_req)
            rospy.logwarn("move_one_joint went ok?=" + str(result))

    def get_joint_names(self):
        return self.joint_states_msg.name

    def check_gripper_pos_safe(self, gripper_value):
        """
        We need to check that the gripper pos is -1.0 > position[6] > -3.14
        Otherwise it gets jammed
        :param gripper_value:
        :return:
        """
        return (-0.5 > gripper_value > -2.0)


def movement_sequence_test():
    openman_obj = OpenManipulatorMove()

    # NOD
    joint_position_home = [0.08743690699338913, 1.0385050773620605, -2.345456600189209, -0.016873789951205254,
                           -1.4818254709243774, 0.0015339808305725455, -1.0599807500839233]
    joint_position1 = [0.8897088766098022, 0.6059224009513855, -1.4419419765472412, -0.016873789951205254,
                       -1.4818254709243774, 0.0015339808305725455, -1.0599807500839233]
    joint_position2 = [0.8912428617477417, 0.5859806537628174, -1.6060779094696045, -0.016873789951205254,
                       -0.8191457390785217, 0.004601942375302315, -1.0599807500839233]
    joint_position3 = [0.8897088766098022, 0.6028544902801514, -1.8745245933532715, -0.015339808538556099,
                       0.5292233824729919, 0.003067961661145091, -1.0599807500839233]
    # SAY NO
    joint_left = [0.44332045316696167, 1.0630487203598022, -2.345456600189209, 0.5568350553512573, -1.483359456062317,
                  0.004601942375302315, -1.0599807500839233]
    joint_right = [-0.20862139761447906, 1.0906603336334229, -2.3071072101593018, -0.6488738656044006,
                   -1.483359456062317, -0.4417864680290222, -1.0599807500839233]
    joint_middle = [0.0076699042692780495, 1.1274758577346802, -2.325515031814575, 0.3344078063964844,
                    -1.4848934412002563, 0.46172821521759033, -1.0599807500839233]
    # Pendulum
    pend_left = [0.46479618549346924, 0.13345633447170258, -1.728796362876892, 1.5907381772994995, -1.6797089576721191, 0.004601942375302315, -1.0799225568771362]
    pend_middle = [0.39269909262657166, 0.1595340073108673, -2.0984857082366943, -0.09817477315664291, -1.0615147352218628, -0.0015339808305725455, -1.0799225568771362]
    pend_right = [0.006135923322290182, 0.42337870597839355, -1.8806605339050293, -1.306951642036438, -1.0661166906356812, -0.004601942375302315, -1.0799225568771362]

    joint_position_sequence_nod = []
    joint_position_sequence_nod.append(joint_position_home)
    joint_position_sequence_nod.append(joint_position1)
    joint_position_sequence_nod.append(joint_position2)
    joint_position_sequence_nod.append(joint_position3)
    joint_position_sequence_nod.append(joint_position2)
    joint_position_sequence_nod.append(joint_position3)
    joint_position_sequence_nod.append(joint_position1)
    joint_position_sequence_nod.append(joint_position_home)
    joint_position_sequence_say_no = []
    joint_position_sequence_say_no.append(joint_position_home)
    joint_position_sequence_say_no.append(joint_left)
    joint_position_sequence_say_no.append(joint_middle)
    joint_position_sequence_say_no.append(joint_right)
    joint_position_sequence_say_no.append(joint_left)
    joint_position_sequence_say_no.append(joint_middle)
    joint_position_sequence_say_no.append(joint_right)
    joint_position_sequence_say_no.append(joint_left)
    joint_position_sequence_say_no.append(joint_right)
    joint_position_sequence_say_no.append(joint_left)
    joint_position_sequence_say_no.append(joint_position_home)
    joint_position_sequence_say_pendulum = []
    joint_position_sequence_say_pendulum.append(joint_position_home)
    joint_position_sequence_say_pendulum.append(pend_left)
    joint_position_sequence_say_pendulum.append(pend_middle)
    joint_position_sequence_say_pendulum.append(pend_right)
    joint_position_sequence_say_pendulum.append(pend_left)
    joint_position_sequence_say_pendulum.append(pend_middle)
    joint_position_sequence_say_pendulum.append(pend_right)
    joint_position_sequence_say_pendulum.append(pend_left)
    joint_position_sequence_say_pendulum.append(pend_middle)
    joint_position_sequence_say_pendulum.append(pend_right)
    joint_position_sequence_say_pendulum.append(joint_position_home)

    for joint_position_array in joint_position_sequence_nod:
        openman_obj.move_all_joints(joint_position_array)
        time.sleep(0.5)

    for joint_position_array in joint_position_sequence_say_no:
        openman_obj.move_all_joints(joint_position_array)
        time.sleep(0.2)

    for joint_position_array in joint_position_sequence_say_pendulum:
        openman_obj.move_all_joints(joint_position_array)
        time.sleep(0.5)


def move_joints_test():
    """
    This is for getting the positions of the joints without testing them
    live, which is quite dangerous!
    :return:
    """
    openman_obj = OpenManipulatorMove()
    joint_names = openman_obj.get_joint_names()
    rospy.logwarn("Starting Moving Joints GUI...")
    while not rospy.is_shutdown():
        rospy.logwarn("#######"+str(joint_names)+"#####")
        joint_id = int(raw_input("Joint ID="))
        joint_position = float(raw_input("Joint Position Radians="))
        openman_obj.move_one_joint(joint_id, joint_position, unit="rad")
        rospy.logwarn("####################")


if __name__ == "__main__":
    rospy.init_node('move_openmanipulator_node', log_level=rospy.WARN)
    # move_joints_test()
    movement_sequence_test()
Now in order to move the arm using our script, we can simply:
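Assuming the package and script created above, and with the position controller still running, that means:
cd ~/simulation_ws
source devel/setup.bash
rosrun openmanipulator_morpheus_chair_tutorials move_openmanipulator.py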
If everything went as expected, your arm should have moved to different places.
With the command above we called the servers responsible for moving the arm. Remember that we launched our controller with roslaunch dynamixel_workbench_controllers position_control.launch, which basically runs the code defined on ~/simulation_ws/src/open_manipulator_tc/dynamixel-workbench/dynamixel_workbench_controllers/src/position_control.cpp. Please have a look at that file if you want to understand how the Service Server and the Topic Subscriber move the arms.
Manually extracting the servo positions
At this point, you might be wondering how we know the radian value for the positions we want the arm to move to. We do that by disabling the controllers, moving the arm to the desired position and then checking the joint states, as explained here:
Step 1: Position the arm as you wish while the control isn't on, and leave it there with the help of someone.
Step 2: Turn on the control (roslaunch dynamixel_workbench_controllers position_control.launch); now the joints are stiff.
Step 3: Run rostopic echo /joint_states/position -n1
And there you will have the exact position of all the joints. You now just have to feed those values to move_openmanipulator.py.
What can I do when my gripper doesn’t move?!
If you have followed the gripper design that we show here, it's possible that it got jammed and doesn't move anymore. Just power everything OFF and gently rotate the gripper AXIS actuator to a more open pose. NEVER send the gripper a command greater than -0.5 (CLOSE) or less than -2.0 (OPEN).
Rate doesn’t work properly?
rateobj = rospy.Rate(10.0)
for joint_position_array in joint_position_sequence_nod:
    openman_obj.move_all_joints(joint_position_array)
    rateobj.sleep()

rateobj2 = rospy.Rate(1.0)
for joint_position_array in joint_position_sequence_say_no:
    openman_obj.move_all_joints(joint_position_array)
    rateobj2.sleep()
You would think that this would make the first movement go fast and the second go slow, but it seems that only the first Rate object counts. If that happens, it is preferable to use time.sleep() instead.
So, that is the post for today. Remember that we have a video showing all the content of this post. Please have a look and leave your thoughts on the comments section of the video. Also, feel free to subscribe to our channel on YouTube because we are publishing ROS-related content every day.
The ROSject contains the code of the following git repository: WhyDoINeedThis
Start the provided ROSject
The first thing you need to do is to copy the aforementioned ROSject. Click on the link to get an automatic copy.
Launch the ROSject
To launch the copied ROSject, go to your ROSDS account and click on the Open button of the copied rosject (see image below). Remember that the rosject you have to open is called WhyINeedIt.
Once the rosject is open, you should see a screen like this:
At this point, you have succeeded in launching the ROSject.
Launch the simulation
Let’s launch a simulation because we want a python class that detects when the robot is upside down. For that, let’s do the following:
Go to Simulations, then with the default empty world on the left, let’s click Choose a robot on the right side, as shown in the image below:
After that, you can type “gurdy” or try to find the Gurdy robot by using the scroll bar. Once the robot is selected, you just click Start Simulation.
You now should see the simulation of the Gurdy robot as in the image below:
Finding the python code
The code used in this post can be found at ~/catkin_ws/src/. You can see the content with:
cd ~/catkin_ws/src/
ls
You can also find the code through the Code Editor. For that, let’s click Tools -> IDE as can be seen on the image below:
Once the IDE is open, we can find all the code that will be used to control the robot inside the catkin_ws folder (catkin workspace).
Detecting when the robot is upside down
In order to detect when the robot is upside down, we have to use the imu topic, which publishes the robot orientation. We can find the topic with the command rostopic list | grep imu :
user:~$ rostopic list | grep imu
/gurdy/imu/data
We see that the topic is named /gurdy/imu/data. We can see what is being published in this topic with rostopic echo -n1 /gurdy/imu/data, as exemplified below:
In the output above, we are more interested in the orientation, because it is what tells us whether the robot is upside down or not.
In order to detect when the robot is upside down, we execute the imu_behaviour.py found in the catkin_ws with the command below:
rosrun example_python_classes imu_behaviour.py
By manually turning the robot upside down, we can see in the output of this script that it correctly detects when that happens. In order to turn the robot upside down, in gzweb (the simulation interface), you click on “Rotate Mode” and then rotate one of the arcs. In the example below, we rotate the green arc.
Gurdy upside down on ROSDS
Moving the robot to the right orientation automatically
When we detect that the robot is upside down, we have to move its legs so that the robot can move back to the right position. There are 6 ROS topics we have to write to, and they can be found with rostopic list | grep command.
We have a script called imu_behaviour_2.py which automatically detects when the robot is upside down and publishes on the topics listed above to bring the robot back to its normal position. The content of the file is the following:
The magic happens in the function get_imu_data shown above. Based on the orientation of the robot, it decides which movement is going to be executed. We can execute the script with the command below:
rosrun example_python_classes imu_behaviour_2.py
With the command above running, if we now flip the robot around, it will automatically recover its position.
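The actual logic is in imu_behaviour_2.py inside the ROSject; purely as an illustration of the idea (this is a simplified sketch, not the content of that file), a minimal upside-down check based on the IMU orientation could look like this:
#! /usr/bin/env python
import rospy
from sensor_msgs.msg import Imu
from tf.transformations import euler_from_quaternion

class UpsideDownDetector(object):
    def __init__(self):
        # Subscribe to the Gurdy IMU topic we found earlier with rostopic list
        self._imu_sub = rospy.Subscriber('/gurdy/imu/data', Imu, self._imu_callback)

    def _imu_callback(self, msg):
        # Convert the orientation quaternion into roll/pitch/yaw angles
        q = msg.orientation
        roll, pitch, _ = euler_from_quaternion([q.x, q.y, q.z, q.w])
        # When the robot is flipped over, roll (or pitch) is close to pi
        if abs(roll) > 2.0 or abs(pitch) > 2.0:
            rospy.logwarn("Gurdy seems to be upside down (roll=%.2f, pitch=%.2f)", roll, pitch)
            # At this point imu_behaviour_2.py publishes on the leg command
            # topics to bring the robot back to its normal position.

if __name__ == '__main__':
    rospy.init_node('upside_down_detector_example')
    UpsideDownDetector()
    rospy.spin()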
Optimizing the code
Although imu_behaviour_2.py does the job it is supposed to, everything is in a single file, and publishers are created every time the execute_movement_gurdy function is called by get_imu_data. In addition, if we are working on a big project, it is better to have some people working on detecting when the robot is upside down, other people working on flipping the robot back, and so on.
To make this possible, it is better to have different Python files, each one with its own class. We do that by creating the GurdyBehaviour class in imu_behaviour_3.py, GurdyImuDataProcessor in imu_data_processor.py, and gurdyJointMover in gurdy_mover.py. With this structure, we could have different people working on each of the classes, following the Single Responsibility Principle.
With the classes split into different files, our code is easier to maintain. Please have a look at the different files to better understand the code and how it interacts with the robot.
You can test one script at a time. The more “optimized” version of our script is found in imu_behaviour_4.py, which can be executed with:
rosrun example_python_classes imu_behaviour_4.py
So, that is the post for today. We hope you guys liked it. If so, please share this post with your friends.
Video with the explanation of this post
Remember that we also have a live version of this post, which can be found on the video below:
If you liked the video and the post, please leave your thoughts on the comments section of the video. You can also subscribe to our channel on YouTube and press the bell for a new video every day.
Did you like this video? Do you have questions about what is explained? Whatever the case, please leave a comment in the comments section below, so we can interact and learn from each other.
If you want to learn about other ROS topics, please let us know in the comments area and we will do a video about it 🙂