Autonomous SLAM with a Roomba

Autonomous robot performing SLAM using a Roomba, a Raspberry Pi, an RPLidar, and a laptop for the UI and heavy CPU work.

This project is the natural successor of an earlier project. The robot is able to explore its surroundings and build a map of them. The robotic platform is a Roomba 630 with an RPLidar and a Raspberry Pi. The Raspberry Pi relays telemetry, including raw lidar data, to a laptop, and takes motion commands and issues them to the Roomba. Additionally, the Raspberry Pi is directly responsible for some basic crash avoidance. All code is written in Java. This project was undertaken as a learning experience and does not use any robotics libraries such as ROS.

Some of the key parts of the software build are:

  • Route planning.
  • Particle filter.
  • Obstacle avoidance.
  • Map building, loop closure and error correction (SLAM).
  • Sparse high-speed occupancy grid map.

This video explains much of what this robot does.

  • 1 × RPLidar
  • 1 × Raspberry Pi
  • 1 × Roomba 630
  • 1 × USB battery, 10 Ah
  • 1 × USB WiFi

View all 6 components

  • Rebuild video with udoo x86 and D435 realsense

    rlsutton101/20/2019 at 22:01 0 comments

    Here is the rebuild video, with a short demo at the end. I'm still working on some software issues processing the point cloud data from the depth camera.

  • Robot upgrades…

    rlsutton101/04/2019 at 10:54 0 comments

    My robot is getting a serious upgrade.

Why? I decided that I couldn’t effectively avoid obstacles that the LIDAR couldn’t see using the Astra depth camera, because its minimum point cloud range is 60 cm.

    This then caused me to find a depth camera with a shorter minimum range and I found the Intel Realsense D435. The D435 has a minimum depth range of 10cm at lower resolutions and 20cm at higher resolutions.

    This led me to the next problem, the Realsense camera requires USB3 and the SDK doesn’t have ARM support either, making the Raspberry PI a problem. This led me to search for a replacement processor that supports USB3 and is X86… and the winner is UDOO X86.

The UDOO requires 12 V at up to 3 A, so I bought a 12 V battery power bank…

I soon realised that the power bank could operate at 12, 16 or 19 volts, and as it only has a single button to turn it on, turn it off and select the voltage, it would be possible to accidentally select a higher voltage when turning it off. This was an unacceptable risk, as the UDOO requires 12 V ±5%, which led to the next search – for a 12 volt buck/boost regulator.

The next challenge was a shortage of USB ports: I need 4, but the UDOO only has 3. This was an easy one to solve.

    Given that all my code is written in Java, the move from Raspberry PI to UDOO X86 is effortless for my code. But the Realsense D435 depth camera doesn’t have much in the way of Java support and I had already sunk some effort into integrating with OpenNI2 for the Astra camera. The Realsense libs do support OpenNI2 – this sounded like an easy fix, but as it turns out it requires manually building the OpenNI2 & Realsense SDK. I cheated a little and used the binaries for the OpenNI2 build, which caused me some other problems but after about 3 or 4 hours and a firmware update for the Realsense, it was working.

    The next problem is how to house all this new hardware

I’m building a new platform for the hardware to sit on top of the Roomba. As the Realsense is much smaller, I’ll be able to condense it into just 2 layers. I started with a couple of MDF cake bases from the local bargain shop. I added a slot to the lower one to allow access to the Roomba’s buttons. I’ve also used some cupboard handles as standoffs to hold the platform above the Roomba with enough clearance for the Roomba’s serial DIN plug; I didn’t want to bring the DIN plug through the platform, so as to leave more clear space for all the hardware to sit on the bottom platform.

    To be continued…

  • Bezier curves for smooth paths and obstacle avoidance

    rlsutton112/16/2018 at 10:54 0 comments

The navigation system was previously creating paths with abrupt turns in them. In this video you can see that I've used Bezier curves to smooth out the paths.

Although it doesn't quite get it right the first time, there is an additional short-term route planner (the red line) designed to plan around local obstacles that are not present in the map. This planner re-plans the next few meters every second.

There is one further improvement here: the robot no longer navigates in local maps and, as a result, doesn't have to stop constantly to switch maps.

    The robot still drives quite conservatively (slow) when near obstacles, as can be seen.
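The log above doesn't show the project's actual smoothing code, but the core idea is small enough to sketch: an abrupt corner between three waypoints can be replaced with points sampled from a quadratic Bezier curve. The class and method names below are mine, purely for illustration.

```java
// Sketch of the idea (not the project's actual code): smoothing a sharp
// corner between three waypoints with a quadratic Bezier curve.
public class BezierSmoother {

    // Evaluate a quadratic Bezier at parameter t in [0, 1].
    // p0 and p2 are points on the original path; p1 is the corner being smoothed.
    static double[] quadraticBezier(double[] p0, double[] p1, double[] p2, double t) {
        double u = 1.0 - t;
        double x = u * u * p0[0] + 2 * u * t * p1[0] + t * t * p2[0];
        double y = u * u * p0[1] + 2 * u * t * p1[1] + t * t * p2[1];
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        // A 90-degree turn: east along the x axis, then north.
        double[] p0 = { 0, 0 }, corner = { 1, 0 }, p2 = { 1, 1 };
        for (int i = 0; i <= 4; i++) {
            double[] p = quadraticBezier(p0, corner, p2, i / 4.0);
            System.out.printf("%.3f %.3f%n", p[0], p[1]);
        }
    }
}
```

The curve starts and ends on the original path but never reaches the corner itself, which is why the smoothed route cuts the turn instead of stopping and pivoting.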

  • Navigator refinement and Particle filter fix

    rlsutton109/30/2018 at 06:34 0 comments

    In this video you can see that the route planner (left side) plans a route staying well away from walls, corners and other obstacles. 

The regular pauses in the robot's travel happen when it swaps out the particle filter map (right side) for the next local map in the path.

Towards the end of the video you can see the particles (right side, red dots) of the particle filter spread out when the robot is travelling quickly (50 cm/second) down the hallway. This is the maximum documented speed of the Roomba.

    The map was built on the fly (SLAM) although I didn't include that in this video.

In the past few weeks I've identified a bug in the particle filter, where the re-sample was done before the update, leading to continual jitter in the output of the particle filter.

I also spent some time improving the route planner. It now correctly calculates a cost based on how close a path passes to obstacles, resulting in a path that does a good job of staying away from obstacles, walls and corners.

    I still intend to add another layer on top of the route planner to take into account the latest Lidar scan and navigate around transient objects.
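The resample-before-update bug mentioned above comes down to step ordering. A minimal 1-D sketch of one corrected filter step follows; the names, noise constants, and Gaussian sensor model are my illustrative assumptions, not the project's code.

```java
import java.util.Random;

// Illustrative sketch (names are mine, not the project's): one particle
// filter step with the corrected ordering -- predict, weight by the
// measurement, THEN resample. Resampling before the weight update, as in
// the bug described above, ignores the scan and causes output jitter.
public class ParticleStep {
    static final Random rng = new Random(42);

    static double[] step(double[] particles, double control, double measurement) {
        int n = particles.length;
        double[] moved = new double[n];
        double[] weights = new double[n];
        double total = 0;
        for (int i = 0; i < n; i++) {
            // 1. Predict: apply odometry plus some motion noise.
            moved[i] = particles[i] + control + rng.nextGaussian() * 0.05;
            // 2. Update: weight each particle by how well it explains the measurement.
            double err = measurement - moved[i];
            weights[i] = Math.exp(-err * err / (2 * 0.1 * 0.1));
            total += weights[i];
        }
        // 3. Resample in proportion to the weights (simple roulette wheel).
        double[] resampled = new double[n];
        for (int i = 0; i < n; i++) {
            double r = rng.nextDouble() * total;
            int j = 0;
            while (r > weights[j] && j < n - 1) { r -= weights[j]; j++; }
            resampled[i] = moved[j];
        }
        return resampled;
    }
}
```

With this ordering the resampled cloud concentrates where the measurement agrees with the prediction, rather than jittering on stale weights.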

  • Orbbec Astra depth camera

    rlsutton107/01/2018 at 22:39 0 comments

    I have now written code to acquire data from the Orbbec Astra depth camera.

My initial experiments were with Orbbec's SDK, and after about 6 hours I was getting usable data via the Java API. I posted those details to the Orbbec forums.

Alas, when I moved my code to the Raspberry Pi, I discovered there is currently no ARM/Linux support in the Orbbec SDK.

Next I moved to OpenNI2, and after about 4 more hours I had usable depth data on the Raspberry Pi. I also posted that code to the Orbbec forums.

A nice thing the Astra does by default is remove the ground plane, which greatly simplifies data processing.

I took the depth data and flattened it by removing the height data, giving effectively a top-down view similar to LIDAR.

The Astra cannot give depth data any closer than 60 cm, which really stops me from using it the way I intended. Fortunately I acquired the used Astra cheaply, so I have ordered an Intel Realsense that can give depth data as close as 10 cm.
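The flattening step described above can be sketched roughly as follows. This assumes a simple pinhole projection; the focal length, image width, class and method names are all my illustrative placeholders, not the Astra's actual intrinsics or the project's code.

```java
// Rough sketch of flattening a depth image into a LIDAR-like top-down
// scan. Each image column keeps only the nearest return, discarding the
// height dimension -- effectively producing a 2D range scan.
public class DepthFlattener {
    static final double FX = 570.0; // assumed focal length in pixels (illustrative)
    static final int WIDTH = 640;   // assumed image width (illustrative)

    // depthRow[x] = depth in millimetres for one image row (0 = no data).
    // ranges[x] accumulates the nearest range seen so far for that column.
    static void foldRow(int[] depthRow, double[] ranges) {
        for (int x = 0; x < depthRow.length; x++) {
            int z = depthRow[x];
            if (z <= 0) continue;                        // no return at this pixel
            double lateral = (x - WIDTH / 2.0) * z / FX; // un-project to lateral offset
            double range = Math.hypot(z, lateral);       // top-down distance
            if (ranges[x] == 0 || range < ranges[x]) ranges[x] = range;
        }
    }
}
```

Calling foldRow for every row of the depth frame collapses the vertical axis, leaving one nearest-obstacle distance per column, much like a single LIDAR sweep.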

  • Mapping Improvements

    rlsutton105/15/2018 at 11:30 0 comments

    This week I managed to find a bug in the code which initializes the particle filter when moving between maps.

    This has enabled extended runs without the robot becoming lost in the simulator, and my general experience is that the real robot actually performs better due to the greater detail available in the real world.

Progress has also been made on placing constraints between maps, although it appears that the positional accuracy when moving between maps is worse than when creating them. As a result I'm considering only adding additional constraints between maps that are far apart, such as for loop closure.

  • Graph SLAM trials

    rlsutton104/30/2018 at 23:03 0 comments

    I have wanted to use Graph SLAM for a long time now but got bogged down trying to detect features to supply to Graph SLAM.

I've tried Hough transforms to find lines, and also wrote various versions of my own corner detection algorithms. I even used deep nets to detect corners and achieved 95% accuracy.

    But all that still didn't amount to a viable set of features.

I then hit on the idea that a feature could actually be an entire map, which is what has led to the current technique of the robot creating sub-maps which it localizes in and moves between.

I also had problems getting my implementation of Graph SLAM working with an X, Y, theta pose.

Ultimately I resorted to a watered-down implementation of Graph SLAM, implemented without using matrices. It doesn't attempt to model the position of the robot, but rather just the relationships between the features.

    Hopefully in the next few weeks I'll be able to integrate my Graph SLAM into the build and finally see a map that improves as the robot traverses the map. 
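One way to read "Graph SLAM without matrices" is iterative constraint relaxation: each edge says "feature b should sit at feature a plus a measured offset", and repeatedly nudging both endpoints toward agreement spreads loop-closure error through the graph. The 1-D toy below is my own sketch of that idea, not the project's implementation.

```java
import java.util.List;

// Toy sketch of matrix-free graph relaxation (my reading of the log, not
// the project's code): each constraint records a measured offset between
// two features, and repeated averaging nudges the features into agreement.
public class GraphRelax {
    static class Constraint {
        final int a, b;
        final double dx; // measured offset from feature a to feature b (1-D for clarity)
        Constraint(int a, int b, double dx) { this.a = a; this.b = b; this.dx = dx; }
    }

    // For each edge, split the residual correction between both endpoints.
    // With a loop-closure edge present, the error is smeared over the
    // whole graph instead of accumulating at the end of the chain.
    static void relax(double[] pos, List<Constraint> edges, int iterations) {
        for (int it = 0; it < iterations; it++) {
            for (Constraint c : edges) {
                double error = (pos[c.b] - pos[c.a]) - c.dx;
                pos[c.a] += error / 2;
                pos[c.b] -= error / 2;
            }
        }
    }
}
```

Full Graph SLAM would solve the same constraints jointly via a sparse linear system; the relaxation above trades optimality for simplicity, which matches the "watered down" description in the log.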

  • Map Building walk through

    rlsutton104/25/2018 at 01:13 0 comments

In this video I explain the high-level process of building the map as the robot runs around a section of my house.

View all 13 project logs
