# Phoebe The Cartographer

A project log for Phoebe TurtleBot

DIY variant of ROS TurtleBot for under $250, capable of simultaneous localization and mapping (SLAM)

Roger • 09/20/2018 at 21:19 • 2 Comments

Once the odometry calculation math in the Roboclaw ROS driver was fixed, I could drive Phoebe around the house and watch laser and odometry data plotted in RViz. It is exciting to see the data stream starting to resemble that of a real functioning autonomous robot! And just like all real robots… the real world does not match the ideal world. Our specific problem of the day is odometry drift: Phoebe’s wheel encoders are not perfectly accurate. Whether from wheel slippage, small debris on the ground, or whatever else, they cause the reported distance to differ slightly from the actual distance traveled. These small errors accumulate over time, so the position calculated from odometry becomes less and less accurate as Phoebe drives.
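To make the drift concrete, here's a tiny Python sketch (not Phoebe's actual code) of how even a small systematic encoder error accumulates over distance:

```python
def integrate_odometry(step_m, steps, error_per_step=0.0):
    """Integrate straight-line odometry. error_per_step models a small
    systematic encoder error (e.g. from wheel slippage) in meters per step."""
    x = 0.0
    for _ in range(steps):
        x += step_m + error_per_step
    return x

# True position after 1000 steps of 0.01 m each is 10 m.
true_x = integrate_odometry(0.01, 1000)

# A tiny 0.1 mm per-step error accumulates to 0.1 m over the same run.
drifted_x = integrate_odometry(0.01, 1000, error_per_step=0.0001)
```

The error never shrinks on its own; the longer the robot drives, the further the odometry-only position estimate wanders from reality.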

The solution to odometry drift is to supplement encoder data with other sensors; the additional information can be used to correct for position drift. In the case of Phoebe and her TurtleBot 3 inspiration, that comes courtesy of the scanning LIDAR. If Phoebe can track LIDAR readings over time and build up a map, that information can also be used to locate Phoebe on the map. This class of algorithms is called SLAM, for Simultaneous Localization and Mapping. And because they’re fundamentally similar robots, it should be straightforward to translate TurtleBot 3’s SLAM demo to my Phoebe.

There are several different SLAM implementations available as ROS modules. I’ll start with Gmapping because that’s what the TurtleBot 3 demo used. As input, this module needs LIDAR data in the form of the ROS topic `/scan` and also the transform tree published via `/tf`, where it finds the geometry relationship between odometry (which I just fixed), base, and laser. As output, gmapping generates an “occupancy grid”: a big table representing the robot’s environment in terms of open space, obstacle, or unknown. And most importantly for our purposes, it generates a transform mapping the `map` coordinate frame to the `odom` coordinate frame. This coordinate transform, generated by comparing LIDAR data against the map, is the correction factor to be applied on top of the odometry-calculated position.
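That map→odom output can be thought of as a 2D transform composed on top of the odometry pose. Here's a minimal sketch of that composition in plain Python (hypothetical numbers, not real ROS tf code):

```python
import math

def compose(a, b):
    """Compose two 2D transforms given as (x, y, theta): result = a then b."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

# Odometry believes the robot is 5 m ahead of where it started...
odom_to_base = (5.0, 0.0, 0.0)

# ...but scan matching against the map says odometry has drifted
# 0.2 m off to one side, so gmapping publishes this map->odom correction:
map_to_odom = (0.0, -0.2, 0.0)

# The corrected robot pose in the map frame:
map_to_base = compose(map_to_odom, odom_to_base)
```

The key design point is that gmapping never overwrites odometry; it only publishes the correction between `map` and `odom`, and anyone who needs the corrected pose composes the two transforms.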

Once all the pieces are in place, Phoebe can start mapping out her environment and correcting for odometry drift as it accumulates.
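For readers new to the occupancy grid mentioned above: ROS's `nav_msgs/OccupancyGrid` message stores the map as a flat row-major array where -1 means unknown, 0 means free, and 100 means occupied. A toy sketch of that representation (not tied to Phoebe's actual map):

```python
# Cell values per the nav_msgs/OccupancyGrid convention.
UNKNOWN, FREE, OCCUPIED = -1, 0, 100

width, height = 5, 3
grid = [UNKNOWN] * (width * height)  # everything starts out unknown

def set_cell(grid, x, y, value):
    """Write one cell in the flat, row-major map array."""
    grid[y * width + x] = value

set_cell(grid, 2, 1, OCCUPIED)  # a wall the LIDAR beam hit
set_cell(grid, 1, 1, FREE)      # a cell the beam passed through
```

Every LIDAR scan carves out more free space and marks more obstacles, so the map fills in as the robot drives around.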

SLAM achievement: Unlocked!

(Cross-posted to NewScrewdriver.com)

## Discussions

Humpelstilzchen wrote 09/22/2018 at 13:47 point

It's always good to see another hobbyist building a ROS robot. From my experience you definitely nailed the sensor precision vs. gmapping delta resolution in your last post; however, in this one I would like to be a bit pedantic: the gmapping node doesn't listen to the odom topic, but instead to the /tf transform.

The gmapping result looks fairly good btw.


Roger wrote 09/22/2018 at 23:22 point

I double-checked the gmapping node documentation (http://wiki.ros.org/gmapping) and you're absolutely right! Thanks for pointing out my beginner mistake, I'll edit my post accordingly.
