
Fixing Functional Problems in URDF

A project log for Phoebe TurtleBot

DIY variant of ROS TurtleBot for <$250 capable of simultaneous localization and mapping (SLAM)

Roger 10/09/2018 at 21:36 • 4 Comments

Once I had a decent looking URDF for Phoebe up and running, I added it into the Phoebe launch files and started working on the problems exposed by putting it to work.

The first problem was the drive wheels. Visually, they were stuck at the origin and didn’t move with the rest of the robot. Looking through error messages, I realized ROS expected me to read the wheel encoder values and publish them as joint states. Since I hadn’t done so, the wheels (attached with “continuous” joints) didn’t know their position. Until I get around to processing wheel encoder values, I changed the joint type to “fixed” to attach them rigidly to the chassis.
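As a rough sketch of that workaround (the link names, joint name, and offsets below are placeholders for illustration, not necessarily what Phoebe’s URDF uses), the wheel joint simply becomes a fixed joint:

    <!-- Placeholder example: drive wheel attached rigidly until wheel
         encoder values are published as joint states. -->
    <joint name="left_wheel_joint" type="fixed">
      <parent link="base_link"/>
      <child link="left_wheel"/>
      <origin xyz="0 0.1 0.03" rpy="0 0 0"/>
    </joint>

Once encoder values are published on the joint_states topic, the joint can go back to “continuous” and robot_state_publisher will place the wheels correctly.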

Looking at the model from multiple angles, I realized I had forgotten the caster wheel. Since it’s not driven, it is represented as a simple sphere and also attached via a fixed joint.
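A minimal sketch of what that caster might look like (sphere radius, names, and mounting offset are made-up values for illustration):

    <link name="caster_wheel">
      <visual>
        <geometry>
          <sphere radius="0.02"/>
        </geometry>
      </visual>
    </link>
    <joint name="caster_joint" type="fixed">
      <parent link="base_link"/>
      <child link="caster_wheel"/>
      <origin xyz="-0.12 0 0.02"/>
    </joint>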

That’s enough to start driving around as a single unit, but the robot’s movement in RViz was reversed front-to-back relative to the LIDAR data plot. This was because I had forgotten to tell ROS the LIDAR is mounted pointing backwards on the robot. Once I did, the 180 degree yaw is visible in the object axis visualization: the LIDAR’s X-axis (red cylinder) points backwards instead of forwards like all the other axes.
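In URDF, that rotation lives in the joint’s origin as a yaw of pi radians. A sketch with placeholder names and offsets:

    <!-- LIDAR mounted facing backwards: rotate the joint by pi radians about Z. -->
    <joint name="lidar_joint" type="fixed">
      <parent link="base_link"/>
      <child link="neato_lidar"/>
      <origin xyz="-0.05 0 0.1" rpy="0 0 3.14159"/>
    </joint>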

The final set of changes might be more cosmetic than functional. When reading about differential drive robots in ROS, it came up several times that the robot’s base_link X/Y origin needs to line up with the robot’s pivot axis. However, it wasn’t clear where the Z origin is supposed to be. Perhaps this is different for each ROS mapping module? The hector_slam package defines several frames, but they don’t appear to be supported by gmapping.

I first defined Phoebe’s origin as the center point between its two drive wheel axles. When rendered in RViz, this means the Z plane intersects the middle of the robot. It works well enough, but the visualization looks a bit odd. Intuitively I want the Z plane to represent the ground, so I decided to drop the robot origin to ground level. In the object axis visualization, this is visible as the purple arrowheads all pointing at a center point below the robot. If I learn later that this was a bad move, I’ll have to change it back.
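One way to express that drop in URDF (the dimensions below are placeholders, not Phoebe’s actual measurements) is to keep base_link’s origin on the ground and lift the chassis visual by half its height instead:

    <link name="base_link">
      <visual>
        <!-- Lift the chassis box so base_link's origin stays at ground level. -->
        <origin xyz="0 0 0.05"/>
        <geometry>
          <box size="0.2 0.15 0.1"/>
        </geometry>
      </visual>
    </link>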

All these changes combined gave me a minimal Phoebe URDF, good enough for RViz to visualize Phoebe’s behavior.

(Cross-posted to NewScrewdriver.com)

Discussions

Humpelstilzchen wrote 10/10/2018 at 18:40 point

Want another tip? http://www.ros.org/reps/rep-0103.html#axis-orientation says that the z-axis points up. From the picture it seems you have it correct. You might also want to check REP-105.

The big difference between hector slam and gmapping from the user’s perspective is that hector is designed to work without odometry. The picture on the wiki page demonstrates a typical use case which applies to both SLAM algorithms.


Roger wrote 10/10/2018 at 19:34 point

Right, those references were super useful when setting up robot coordinate frames in URDF. The alphanumeric soup of "REP-0103" and "REP-0105" severely understates how important those documents are to ROS component interoperability. They should really be front and center in ROS tutorials as required reading.

I read the research paper linked from the hector slam page. It explained the algorithm can work without odometry by "leveraging the high update rate of modern LIDAR systems". One of the example robots in the paper used a Hokuyo UTM-30LX, which spins at 2400 rpm to deliver scans at 40Hz. My cheap Neato LIDAR is 1/10th as fast: 240 rpm for 4Hz scans. I'm not sure hector slam will be happy with that (lack of) speed. What do you think?


Humpelstilzchen wrote 10/10/2018 at 19:52 point

I have tested hector mapping with a rplidar a1/a2 at ~5.5Hz by walking with the lidar in my hand. The algorithm worked OK for slow translations (~20cm/s) but got lost quite easily on rotations. Still, I'd say it's worth a try; it can be set up very quickly since only rviz, hector_mapping and the lidar node are needed. The launch file I used is https://gist.github.com/nxdefiant/bc5af0fd97fa4a44ea3c6ea6a3ba44ed; just swap the rplidar node with your neato node and you should be able to run the test.


Humpelstilzchen wrote 10/15/2018 at 16:01 point

Update: I rechecked hector_mapping with the rplidar a2 on the express setting (12Hz / 4000 samples/s) with good results.
