Phoebe TurtleBot

DIY variant of the ROS TurtleBot for under $250, capable of simultaneous localization and mapping (SLAM)

The open source Robot Operating System (ROS) has two standard platforms: the very expensive PR2, and the "low cost" TurtleBot. Sadly, at several thousand dollars, the TurtleBot was only "low cost" relative to the six-figure PR2. Recent developments have reduced the price of admission to as low as $550 (TurtleBot 3 Burger), but resourceful hackers can build their own for even less. Phoebe is one way to build a TurtleBot variant on a budget that still has enough sensors for autonomy.

It is easy to build a very basic robot that runs the open source Robot Operating System: just put a motor controller on a Raspberry Pi and connect them to motors that turn wheels. However, such a basic robot has no sensors and is unaware of its surroundings. The power of the ROS ecosystem lies in its vast library of software for intelligent robot behavior, and to tap into that power, a robot needs to provide the sensory input those software pieces require.

As the entry-level ROS platform, a TurtleBot offers two streams of sensor data:

  1. Encoders that precisely count wheel rotation. We can estimate a robot's position if we know how far each wheel has traveled. This information is commonly presented under the name "/odom" (short for odometry).
  2. A laser distance scanner (LDS) that continuously spins a LIDAR module to measure distance to obstacles all around the robot. This information is commonly presented under the name "/scan".

Providing this sensor data in addition to motor control is the minimum requirement to access a range of publicly available algorithms, such as those that allow a robot to gain awareness of its surroundings. This task of mapping the surroundings while locating itself within that map is called simultaneous localization and mapping, or SLAM.
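To make the odometry stream concrete, here is a minimal sketch of the math behind /odom: integrating a robot's pose from encoder tick counts using standard differential drive geometry. The tick count, wheel radius, and track width are hypothetical placeholders, not measurements from any particular robot:

```python
import math

# Hypothetical robot parameters -- substitute your own measurements.
TICKS_PER_REV = 4096   # encoder ticks per wheel revolution
WHEEL_RADIUS = 0.04    # wheel radius in meters
TRACK_WIDTH = 0.20     # distance between wheel centers in meters

def update_pose(x, y, theta, d_left_ticks, d_right_ticks):
    """Integrate one odometry step from encoder tick deltas."""
    # Convert tick deltas to distance traveled by each wheel.
    d_left = 2 * math.pi * WHEEL_RADIUS * d_left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * d_right_ticks / TICKS_PER_REV
    d_center = (d_left + d_right) / 2           # distance traveled by robot center
    d_theta = (d_right - d_left) / TRACK_WIDTH  # change in heading
    # Advance along the average heading during this step.
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    theta += d_theta
    return x, y, theta
```

Equal tick counts on both wheels move the pose straight ahead; mismatched counts rotate it. Small per-step errors in these tick counts are exactly what accumulates into the odometry drift that SLAM later corrects.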

This project, Phoebe, is a DIY TurtleBot variant whose odometry and laser scanning data are similar enough to the official TurtleBot's that Phoebe can run all the same ROS autonomy modules a TurtleBot can. If all the parts had to be purchased, they would add up to around $200-$250. However, I chose these components because they were already on hand in my parts bin. For a more detailed breakdown of component cost, see the build log entry "Cost to Build Phoebe From Scratch".

This "Parts Bin TurtleBot" ("PB TB") is called "Phoebe", following the naming tradition set by R2-D2 ("Artoo").

  • 1 × Raspberry Pi 3 Responsible for running ROS nodes to interface with motors and sensors.
  • 1 × microSD card for Raspberry Pi 3 Main system storage for Raspberry Pi 3.
  • 1 × Battery output DC to 5V DC converter Power source for Raspberry Pi 3. I've had good experience with cheap MP1584 step-down converters from Amazon.
  • 1 × Neato XV-11 (or similar) robot vacuum laser distance scanner Units salvaged from broken vacuums are available on eBay, search for "Neato LIDAR"
  • 1 × Battery output DC to 3V DC converter Neato LIDAR's scanning motor requires 3V to spin at the correct speed. (3.3V is too fast.) Alternatively: use a DC motor speed controller to explicitly control spin speed of motor.

View all 11 components

  • Speedy Phoebe: Swapping Gearbox For 370 Motors

    Roger 14 hours ago 0 comments

    The first rough draft chassis for Phoebe worked well enough for me to understand some pretty serious problems with that design. Now that I have a better idea what I’m doing, it’s time to start working on a new chassis incorporating lessons learned. And since all the components will be taken apart anyway, it would be a good time to address another problem: Phoebe’s speed. More precisely, the lack thereof.

    Phoebe’s motor + encoder + gearbox units were borrowed from the retired parts bin for SGVHAK Rover. Since they were originally purchased to handle steering, the priority was precision and torque rather than speed. They worked well enough for Phoebe to move around, but their slow speed meant it took quite some time to map a room.

    The motor mounts used for Phoebe’s first draft chassis were repurposed from a self-balancing robot toy, which had a similar motor coupled with a different-ratio gearbox. That motor was not suitable for ROS work because it had no encoder, but perhaps I could swap its gearbox onto the motor that does have an encoder.

    Identical output shaft and mount

    Here’s the “Before” picture: self-balancing robot motor + gearbox on the left, former SGVHAK rover steering encoder + motor + gearbox on the right. I was able to use the self-balancing robot’s motor mount and wheel because both gearboxes have the same output shaft diameter and mount points. While their diameters are identical, the steering gearbox is noticeably longer than the balancing robot gearbox.

    Both Are 370 Motors

    Both of these motors conform to a generic commodity form factor called “370”. A search for “370 motor” on Alibaba will find many companies making different motors with different options of voltage, speed, etc. All are physically similar in size, maybe even identical. As for what “370” means… my best guess is that it originally referred to the overall length of the motor body, roughly 37.0 millimeters. It doesn’t specify anything about the remaining motor dimensions, but “370 motor” has probably become a de facto standard. (Like 608 bearings.)

    After a few screws were removed, both gearboxes were easily disassembled. We can see a few neat things: the plate mounting the gearbox to the motor had multiple holes to accommodate three different patterns. Either these gearboxes were designed to fit on multiple different motors, or some 370 motors are made with different bolt patterns than others.

    Gearboxes Removed

    Fortunately, both motors (one with encoder, one without) seem to have the same bolt pattern. And more importantly – the gear mounted on the motor output shaft seems to be identical as well! I don’t have to pull the gear off one shaft and mount it on another, which is great because that process tends to leave the gear weaker. With identical gears already mounted on the motor output shaft, I can literally bolt on the other gearbox and complete the swap.

    After - Gearboxes swapped

    Voila! The motor with encoder now has a different gear ratio that should allow Phoebe to run a lot faster. The slow gearbox was advertised as a 227:1 ratio. I don’t have a specification sheet for the fast gearbox, but turning one shaft and counting revolutions of the other indicates a roughly 20:1 ratio. So theoretically Phoebe’s top speed has increased roughly ten-fold. Would that be too fast, causing Phoebe to run out of control? Would it be unable to slow to a sufficiently low crawl speed for Phoebe to cautiously explore new worlds? We won’t know until we try!
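    For the curious, the back-of-envelope math behind that estimate, using the ratios above (a sketch, not a measurement):

```python
# Same motor RPM drives both gearboxes, so output shaft speed scales
# inversely with the gear ratio.
old_ratio = 227.0   # advertised ratio of the slow steering gearbox
new_ratio = 20.0    # rough ratio measured by counting shaft turns

speedup = old_ratio / new_ratio
print(speedup)  # 11.35 -- roughly the ten-fold figure above
```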


  • Phoebe’s Nemesis: Office Chair

    Roger a day ago 3 comments

    A little real-world experience with Phoebe has revealed several problems with my first rough draft chassis design. The second problem on the list is Phoebe’s LIDAR height: it sits too high to detect certain obstacles, like an office chair. In the picture below, Phoebe has entered a perilous situation without realizing it. Because of the LIDAR’s height, Phoebe sees only the chair’s center post, thinks it is a safe distance away, and remains blissfully ignorant of the chair legs that have completely blocked its path.

    Phoebe Faces Office Chair

    Here is the RViz plot of the situation, representing Phoebe’s perspective. A red arrow indicates Phoebe’s position and direction. Light gray represents cells in the map occupancy grid thought to be open space, and black cells are occupied by an obstacle. The office chair’s center post is represented by two black squares at the tip of the red arrow, and the chair’s legs are absent because Phoebe never saw them with LIDAR.

    Phoebe and Office Chair Post

    This presents obvious problems in collision avoidance as Phoebe can’t avoid chair legs that it can’t see. Mounting height for Phoebe’s LIDAR has to be lowered in order to detect office chair legs.

    Now that I’ve seen this problem firsthand, I realize it would also be an issue for the TurtleBot 3 Burger. It has a compact footprint, and its parts are stacked upwards, so it can’t see office chair legs either. That’s OK as long as the robot is constrained to environments where walls are vertical and tall, like the maze seen in the TurtleBot 3 navigation demo. Phoebe would work well in such constrained environments too, but I’m not interested in constrained environments. I want Phoebe to roam my house.

    Which leads us to the Waffle Pi, the other TurtleBot 3 model. It has a larger footprint than the Burger, but its squat shape allows the LIDAR to be mounted lower while still having a clear view all around the top of the robot.

    So I need to raise the bottom of Phoebe for ground clearance, and also lower the top for LIDAR mount. If the LIDAR can be low enough to look just over the top of the wheels, that should be good enough to see an office chair’s legs. Will I find a way to fit all of Phoebe’s components into this reduced height range? That’s the challenge at hand.


  • Phoebe’s Nemesis: Floor Transitions

    Roger 2 days ago 0 comments

    Right now Phoebe is running around on a very rough first draft of chassis design. It was put together literally in an afternoon in the interest of time. Just throw the parts together so we can see if the idea will even work. Well, it did! And I’m starting to find faults with the first draft chassis that I want to address on the next version for a much better thought-out design.

    The first major fault is the lack of ground clearance. When I switched my mentality from the rough-terrain-capable Sawppy rover to a flat-ground TurtleBot like Phoebe, I didn’t think the latter would need much ground clearance at all. As a result, Phoebe’s battery pack hung between the driving wheels and caster, with only a few millimeters of clearance between the bottom of the battery tray and the ground.

    Phoebe Ground Clearance Flat

    If I’m not climbing rocks, I asked myself, why would I need ground clearance?

    Well, I’ve found my answer: my home has rooms with carpet, rooms with linoleum, and rooms with tile. The transitions between these surfaces are not completely flat. They’re pretty trivial for a walking human being, but for poor little Phoebe they are huge obstacles. Driving across the doorway from carpet to linoleum would cause Phoebe to get stuck on its battery belly.

    Phoebe Ground Clearance Threshold

    “More ground clearance” is a goal for Phoebe’s next chassis.


  • ROS Notes: Map Resolution

    Roger 4 days ago 0 comments

    Now that I’m driving Phoebe around and mapping my house, I’m starting to get some first-hand experience with robotic mapping. One of the most fascinating bits of revelation concerns map resolution.

    When a robot launches the Gmapping module, one of the parameters (delta) dictates the granularity of the occupancy grid in meters. For example, setting it to 0.05 (the value used in TurtleBot 3 mapping demo) means each square in the grid is 0.05 meters or 5 cm on each side.

    This feels reasonable for a robot that roams around a household. Large objects like walls would be mapped fine, and the smallest common obstacles in a house, like table legs, can reasonably fill a 5cm x 5cm cell on the occupancy grid. If the grid cells were any larger, it would have trouble properly accounting for chair and table legs.

    Low Resolution Sharp Map 5cm

    So if we make the grid cells smaller, we would get better maps, right?

    It’s actually not that simple.

    The first issue stems from computation load. Increasing resolution drastically increases the amount of memory consumed to track the occupancy grid, and increases the computation required to keep grid cells updated. The increase in memory consumption is easy to calculate: if we halve the grid granularity from 5cm to 2.5cm, each 5cm square becomes four 2.5cm squares, quadrupling the memory requirement for our occupancy grid. Tracking and maintaining this map is a lot more work. In my experience the mapping module has a much harder time matching LIDAR scan data to the map, causing occasional skips in data processing that end up reducing map quality.
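    That quadrupling is easy to sanity-check with a quick sketch. The 20m x 20m map size here is a made-up example, not a measurement of my house:

```python
def grid_cells(width_m, height_m, delta_m):
    """Number of cells in an occupancy grid covering the given area."""
    return round(width_m / delta_m) * round(height_m / delta_m)

coarse = grid_cells(20, 20, 0.05)    # 400 x 400 grid at 5cm cells
fine = grid_cells(20, 20, 0.025)     # 800 x 800 grid at 2.5cm cells
print(coarse, fine, fine // coarse)  # 160000 640000 4
```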

    The second issue stems from sensor precision. An inexpensive LIDAR like my unit salvaged from a Neato robot vacuum isn’t terribly precise, returning noisy distance readings that vary over time even when the robot and the obstacle are both standing still. When the noise exceeds map granularity, the occupancy grid starts getting “fuzzy”. For example, a solid wall might no longer be a single surface, but several nearby surfaces.

    High Resolution Fuzzy Map 1cm

    As a result of those two factors, arbitrarily increasing occupancy map resolution can drastically increase the cost without a worthwhile improvement in return. This downside of going “too small” was less obvious than the downside of going “too large.” There’s a “just right” point in between that makes the best trade-offs. Finding the right map granularity to match robots and their tasks is going to be an interesting challenge.


  • Phoebe The Cartographer

    Roger 5 days ago 2 comments

    Once the odometry calculation math in the Roboclaw ROS driver was fixed, I could drive Phoebe around the house and watch laser and odometry data plotted in RViz. It is exciting to see the data stream starting to resemble that of a real functioning autonomous robot! And just like all real robots… the real world does not match the ideal world. Our specific problem of the day is odometry drift: Phoebe’s wheel encoders are not perfectly accurate. Whether from wheel slippage, small debris on the ground, or whatever else, they cause the reported distance to differ slightly from the actual distance traveled. These small errors accumulate over time, so the position calculated from odometry becomes less and less accurate as Phoebe drives.

    The solution to odometry drift is to supplement encoder data with other sensors whose additional information can help correct for position drift. In the case of Phoebe and her TurtleBot 3 inspiration, that comes courtesy of the scanning LIDAR. If Phoebe can track LIDAR readings over time and build up a map, that information can also be used to locate Phoebe on the map. This class of algorithms is called SLAM, for Simultaneous Localization and Mapping. And because they’re fundamentally similar robots, it is straightforward to translate TurtleBot 3’s SLAM demo to my Phoebe.

    There are several different SLAM implementations available as ROS modules. I’ll start with Gmapping because that’s what the TurtleBot 3 demo used. As input, this module needs LIDAR data in the form of the ROS topic /scan and also the transform tree published via /tf, where it finds the geometric relationship between odometry (which I just fixed), base, and laser. As output, Gmapping generates an “occupancy grid”, a big table representing a robot’s environment in terms of open space, obstacle, or unknown. And most importantly for our purposes: it generates a transform from the map coordinate frame to the odom coordinate frame. This coordinate transform, produced by comparing LIDAR data to the map, is the correction factor to be applied on top of the odometry-calculated position.
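    To illustrate what that correction does, here is a toy 2D version of applying a map-to-odom transform on top of an odometry pose. The numbers are made up, and real ROS handles this in full 3D with quaternions via tf:

```python
import math

def apply_transform(tf, pose):
    """Apply a 2D transform (x, y, theta) to a pose expressed in its child frame."""
    tx, ty, ttheta = tf
    px, py, ptheta = pose
    # Rotate the pose by the transform's heading, then translate.
    x = tx + px * math.cos(ttheta) - py * math.sin(ttheta)
    y = ty + px * math.sin(ttheta) + py * math.cos(ttheta)
    return x, y, ttheta + ptheta

# Odometry says the robot is 2m forward, but the SLAM module's map->odom
# correction says odometry has drifted 0.1m off to one side:
map_to_odom = (0.0, -0.1, 0.0)   # hypothetical correction from scan matching
odom_pose = (2.0, 0.0, 0.0)      # pose integrated from wheel encoders
print(apply_transform(map_to_odom, odom_pose))  # (2.0, -0.1, 0.0) in map frame
```

    The key design point is that the SLAM module never rewrites odometry; it publishes a separate, slowly-changing correction that sits on top of the fast, drifting odometry stream.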

    Once all the pieces are in place, Phoebe can start mapping out its environment and also correct for small errors in odometry position as it drifts.

    SLAM achievement: Unlocked!

    Phoebe GMapping

  • Driving Miss Phoebe

    Roger 6 days ago 0 comments

    This project page is primarily focused on hardware, but because Phoebe is a project to explore ROS, there's a significant software component as well. After the chassis first draft was assembled, I dove into the software side.

    Up on GitHub is a free ROS node for the Roboclaw motor controller module used on Phoebe. I encountered three problems while trying to get Phoebe moving, and all three fixes have been submitted back to the original repository as pull requests.

    Once Phoebe was moving reliably, I tried to integrate LIDAR data with the odometry data Roboclaw generates from reading motor encoders, but that unveiled another bug in the odometry calculation.

    This fix has also been submitted as a pull request.

    If and when the pull requests are approved, the original repository will pick up those fixes. But until then, people using Roboclaw in ROS and encountering the same issues can clone my fork of it.

  • LIDAR Installation Completes First Draft

    Roger 09/14/2018 at 18:09 0 comments

    With the motors connected to Roboclaw, their direction and encoder in sync, and PID values tuned, Phoebe can be driven around via ROS /cmd_vel topic and report its movement via /odom. However, Phoebe has no awareness of its surroundings, which is where the LIDAR module comes in.

    Salvaged from a Neato robot vacuum (and bought off eBay), it is the final major component to be installed on Phoebe. Since this is a rough first draft, the most expedient way to install the device is to drill a few holes for M3 standoffs, and mount the module on top of them. This allows the module clear lines of sight all around the robot, while sitting level with the ground. It is also installed as close to the center of the robot as practical. I don’t know if a center location is critical, but intuitively it seems to be a good thing to have. We’ll know more once we start driving it around and see what it does.

    By this point the rough draft nature of the project is very visible. The LIDAR spin motor sticks out furthest below the module, and it inadvertently sits right on top of the Raspberry Pi’s Ethernet port, the tallest point on a Pi. Raising the LIDAR high enough so they don’t collide left a lot of empty space between the two modules, which is not wasted at the moment because the wiring mess is getting out of control and could use all the space it can occupy.

    The next version should lay things out differently to make everything neater. In the meantime, it’s time to see if we can drive this robot around and watch its LIDAR plot. And once that basic process has been debugged, that should be everything necessary to enable ROS projects to give Phoebe some level of autonomy.

    Phoebe TurtleBot Stage 3 LIDAR

  • Phoebe Receives Raspberry Pi Brain After PID Tuning

    Roger 09/13/2018 at 18:11 0 comments

    Once the motors’ spin directions were sorted out, I connected both encoders to verify their A/B signals are in sync with motor direction. Again, this is checked by commanding motor movement via the Ion Studio software and watching the reported encoder value.

    When wired correctly, the encoder counter will increase when the motor is commanded to spin in the positive direction, and decrease when the motor spins negative. If hooked up wrong, the encoder value will decrease when the motor spins positive, and vice versa. The fix is simple: power down the system and swap the A/B quadrature encoder signal wires.

    Once the motor direction is verified correct, and encoder wires verified to match motor direction, we can proceed to the final phase of Roboclaw setup: determine PID coefficients for motor control.

    PID tuning is something of a black art. Fortunately, while a perfect tune is very difficult to obtain, it’s not that hard to get to “good enough.” Furthermore, Ion Studio features an “Auto Tune” option to automatically find functional PID coefficients. During SGVHAK Rover construction we had no luck getting it to work and resorted to tuning PID coefficients manually. Fortunately, this time around Ion Studio’s automatic PID tuning worked. I’m not sure what changed, but I’m not going to complain.
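    For reference, here is the textbook form of what such a velocity control loop computes, as a minimal sketch. On Phoebe the real loop runs inside the Roboclaw firmware, and the gains below are arbitrary placeholders, not tuned values:

```python
class PID:
    """Minimal PID controller sketch: output = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        error = target - measured
        self.integral += error * dt                      # accumulate error over time
        derivative = (error - self.prev_error) / dt      # rate of change of error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=1.0, ki=0.5, kd=0.1)                        # placeholder gains
output = pid.update(target=100.0, measured=90.0, dt=0.01)  # one control step
```

    Tuning is the art of picking kp, ki, and kd so the motor reaches the target speed quickly without oscillating, which is what Ion Studio's Auto Tune searches for automatically.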

    Once the PID coefficients have been written to Roboclaw NVRAM, we no longer need the Windows-based Ion Studio software. From here on out, we can use a Raspberry Pi to control our motors. The Pi 3 was mounted so its microSD card remains accessible, as well as its HDMI and USB ports. This meant trading off access to the GPIO pins, but we’re not planning to use those just yet, so that’s OK.

    Software-wise, the Raspberry Pi 3’s microSD card has a full desktop installation of ROS Kinetic on top of Ubuntu MATE 16.04 compiled for Raspberry Pi. In addition to all the Robotis software for TurtleBot 3, it also has a clone of the Roboclaw ROS control node, as well as a clone of the Neato LIDAR control node.

    The wiring is not very neat or pretty but, again, this is just a rough first draft.

    Phoebe TurtleBot Stage 2 Encoder Pi

    (Cross-posted to

  • Establish Motor Directions

    Roger 09/12/2018 at 17:15 0 comments

    The first revision of Phoebe’s body frame has mounting points for the two drive wheels and the caster wheel. There are two larger holes to accommodate the drive motor wiring bundles, and four smaller holes to mount a battery tray beneath the frame. Since this is the first rough draft, I didn’t bother overthinking further details. We’ll wing it and take notes along the way for the next revision.

    Phoebe Frame First Draft.PNG

    After the wheels were installed, there was much happiness because the top surface of the frame sat level with the ground, indicating the height compensation (for height difference between motorized wheels and caster in the back) was correct or at least close enough.

    Next, two holes were drilled to mechanically mount the Roboclaw motor control module. Once secured, a small battery was connected, along with both motors’ drive power wires. Encoder data wires were left unconnected and taped out of the way, as they were not yet needed for the first test: establishing the direction of motor rotation.

    Phoebe TurtleBot Stage 1 PWM

    The Roboclaw ROS node expects the robot’s right side motor to be connected as Motor #1, and the left as Motor #2. It also expects positive direction on both motors to correspond to forward motion.

    I verified robot wiring using Ion Studio, the Windows-based utility published by the makers of the Roboclaw. I used Ion Studio to command the motors via USB cable and verify the right motor rotates clockwise for positive motion, and the left motor counter-clockwise for positive motion. I got it right on the first try purely by accident, but it wouldn’t have been a big deal if one or both motors spun the wrong way: all I would have had to do is swap that motor’s drive power wires to reverse the polarity.
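    With those conventions in place (right wheel = Motor #1, positive = forward), turning a ROS /cmd_vel command into per-wheel speeds is straightforward differential drive kinematics. A sketch, with a hypothetical track width:

```python
TRACK = 0.20  # hypothetical distance between wheel centers, in meters

def cmd_vel_to_wheels(linear, angular):
    """Convert linear (m/s) and angular (rad/s) velocity into left/right
    wheel speeds, with positive speed meaning forward on both wheels."""
    left = linear - angular * TRACK / 2
    right = linear + angular * TRACK / 2
    return left, right

print(cmd_vel_to_wheels(0.2, 0.0))  # (0.2, 0.2): drive straight ahead
print(cmd_vel_to_wheels(0.0, 1.0))  # (-0.1, 0.1): spin in place
```

    This is why the "positive = forward" convention matters: if one motor's sign is flipped, a straight-ahead command makes the robot spin instead.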


  • Test Frame To Help Arrange Phoebe’s Wheels

    Roger 09/11/2018 at 17:27 0 comments

    Since Phoebe will be a TurtleBot variant built out of stuff already in my parts bin, these parts won’t necessarily fit together well. The first thing to do is to figure out how to make the wheels work together. A simple test frame will mount Phoebe’s two drive wheels and see how they cooperate. And besides, building a two-wheel test chassis is how I’ve started many robot projects and that’s worked out well so far. So now let’s make another one to follow in the grand tradition of two wheel test chassis built to test parts going into SGVHAK Rover and Sawppy Rover.

    Phoebe TurtleBot Two Wheel Test Frame

    For Phoebe, this simple test chassis established the following:

    • I used a caliper to measure wheel mounting bracket dimensions, and they are accurate enough to proceed. They are spaced the correct distance apart, and their diameter is large enough for M4 bolts to slide through without being so large that the resulting wheel wobbles.
    • The 5mm thick slender connecting members are too weak. The next frame will have greater thickness and generally beefier structure.
    • I wanted a 20cm track (left-right distance between wheel centers). I measured the dimensions of my wheel assembly, but the measurements were a little off. Now I know how much to adjust for the next frame.
    • And most importantly: this frame allowed direct comparison of drive wheel resting height against caster wheel height. They were both awkward shapes to measure with a ruler so having the flat surface of a frame makes the measurement easier. Their relative height difference needs to be accounted for in the next frame in order to have a robot body that is level with the ground.


View all 12 project logs


Humpelstilzchen wrote 09/09/2018 at 07:13

Which laser scanner do you use on this budget?


Roger wrote 09/09/2018 at 17:36

Thanks for finding my project before I even had a chance to fill out all the details! I'll be posting more information soon, but I'm happy to answer your specific question now: I'm using a laser scanner salvaged from a Neato robot vacuum. Search on eBay for "Neato LIDAR" and you should see several options with "Buy It Now" prices in the $50-$75 range.


Humpelstilzchen wrote 09/09/2018 at 17:38

Thanks, I'll patiently wait for more details then.

