FPGA-based robot vacuum cleaner

3D-printed cleaning robot based on an FPGA.

3D-printed cleaning robot based on an FPGA. The robot has several sensors to detect obstacles and a suction unit to clean floors. In addition, it runs a navigation algorithm that lets it cover rooms as efficiently as possible and react to obstacles accordingly.

I am working on an autonomous robot vacuum for cleaning floors. I try to incorporate the following aspects into the design:

  • Functionality. The robot should be able to make a real contribution to a clean floor.
  • Intuition. Intuitive operation and high functionality lead to frequent use of the device, which frees up time for something more important than floor cleaning.
  • Safety. This aspect has the highest priority. The robot should not accidentally fall down stairs or cause a short circuit.
  • Fun. This is the main reason why I started this project half a year ago. I am learning a lot by working on this vehicle. Furthermore, it is interesting and challenging to solve technical problems.

I am trying to publish a new article with updates in the log section every week.




  • WebApp with Bluetooth low energy connectivity

    Lars M., 05/20/2020 at 09:23

    The real user interface

    As you can see in pictures of the robot, there are a few control elements on the roof of the vehicle which form an interface between the technology and the user. In short, the user can give commands to the robot via this interface, i.e. tell it what to do. The user interface is kept very simple, so that hardly any previous knowledge is required. The vehicle has only one switch, which turns the device on or off. Additionally, there is a grey pushbutton to restart the robot. The status of the battery is indicated by the green and red LEDs. The last element of the interface is a yellow LED which lights up in case of certain errors.

    The virtual user interface

    So far so good. However, there are several settings that could in theory be made available to the user. As an example, the user could adjust the fan power, allowing a more individual operation and a better fit to their own needs or circumstances. But since there isn't enough space, no further buttons or switches can be mounted on the robot to add such functions.

    A solution to this problem is to use devices that almost every user already owns, such as a mobile phone or PC. Because of this, I decided two weeks ago to program a web application which serves as a second user interface. To communicate with the robot, Bluetooth Low Energy (BLE) radio technology is used. To receive data, an HC-08 BLE module is integrated in the robot.

    The idea for this came to me through a presentation on the "Google Developers" YouTube channel, which reports in detail on the Physical Web and Bluetooth web applications.

    The website can therefore be used to send commands to and receive data from the robot via Bluetooth. The web application runs in the browser. To communicate with the robot, you must activate location services and Bluetooth on your smartphone.


    With this second interface, further settings can be defined by the user. After login, the user is redirected to the start page. At the top there are the "Connect" and "Disconnect" buttons, which initiate or terminate a Bluetooth connection. Below the Bluetooth buttons the battery capacity is displayed. To control the fan power, there is a slider under the status bar. With the two control buttons the main functions of the robot can be managed. Each button has two functions: the left button switches the robot off. When it is pressed, its text changes to "Enable autopilot"; pressing it a second time starts the robot again, without the need for multiple buttons in the same application. The same applies to the "Fan off" button, which allows the fan to be started again after deactivation.

    Note that the code below works, but it is improved from time to time. The latest version will be available on GitHub soon.
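    As a sketch of how such a browser UI can talk to the robot, the Web Bluetooth API can be used roughly as follows. The 0xFFE0/0xFFE1 UUIDs are the ones commonly used by HC-08 modules (verify them against your module), and the command strings are assumptions for illustration:

```javascript
// Sketch of the "Connect" button logic using the Web Bluetooth API.
// The service/characteristic UUIDs below are those commonly used by
// HC-08 modules (0xFFE0 / 0xFFE1) -- check your module's documentation.
const UART_SERVICE = 0xffe0;
const UART_CHARACTERISTIC = 0xffe1;

let uartCharacteristic = null;

async function connect() {
  // Ask the browser to show the device chooser, filtered to our service.
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: [UART_SERVICE] }],
  });
  const server = await device.gatt.connect();
  const service = await server.getPrimaryService(UART_SERVICE);
  uartCharacteristic = await service.getCharacteristic(UART_CHARACTERISTIC);
}

// Commands are sent as short ASCII strings; the exact command set
// (e.g. "F5" for fan level 5) is an assumption for illustration.
function encodeCommand(cmd) {
  return new TextEncoder().encode(cmd);
}

async function sendCommand(cmd) {
  if (uartCharacteristic === null) throw new Error("not connected");
  await uartCharacteristic.writeValue(encodeCommand(cmd));
}
```

    The chooser dialog and GATT connection are handled by the browser, so the page itself never needs pairing logic.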

    The HTML code for the start page is as follows:

    <!DOCTYPE html>
        <meta charset="utf-8">
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <link href="styles.css" rel="stylesheet">
    <h1>Proto Lite UI</h1>
    <button id="connect" type="button">Connect</button>
    <button id="disconnect" type="button">Disconnect</button>
    <hr width="55%" />  <!--Trennstrich-->
    <div id="terminal"></div>
    <p  id="target">Battery level: not connected</p>
    <hr width="55%"/>
    <h4> Fan power</h4>
    <input type="range" min="0" max="10" value="5" class="slider" id="sliderAmount" onchange="updateSlider(this.value)">
    <h4>Control buttons</h4>
    <input type="button" id="launch" value="Stop"...
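    The `updateSlider` handler referenced by the range input could be sketched like this. The "F<n>" command format and the `send` callback are placeholders; the real command set depends on what the FPGA's UART receiver expects:

```javascript
// Map the slider position (0-10) to a fan command string.
// The "F<n>" format is a placeholder for the robot's actual command set.
function fanCommand(sliderValue) {
  const level = Math.min(10, Math.max(0, Number(sliderValue)));
  return "F" + level;
}

// Wired to the slider via onchange="updateSlider(this.value)".
// `send` stands in for the page's BLE write helper (an assumption).
function updateSlider(value, send = (cmd) => console.log("send:", cmd)) {
  send(fanCommand(value));
}
```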

  • Ultrasonic sensors for obstacle detection

    Lars M., 05/07/2020 at 15:42

    Today I want to continue with the basic system of the robot.

    The article is structured as follows:

    • Basics
    • The Sensor
    • Communication and behavior


    Ultrasonic sensors can be used for distance measurements. They can detect objects without touching them and then output the distance from the object to the sensor. To do this, the ultrasonic sensor generates a high-frequency sound wave of approximately 40 kHz. The wave propagates at about 343 m/s, the speed of sound in air. Another characteristic of a sound wave is that it is reflected when it hits an object. This creates an echo that can be detected by the sensor.

    The distance to the object can now be calculated from the time between sending and receiving the frequency.

    s = (v * t) / 2

    s = distance in m

    v = velocity in m/s = speed of sound (343.2 m/s)

    t = time between sending and receiving in s
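    The formula translates directly into code. The division by 2 accounts for the wave travelling to the object and back:

```javascript
const SPEED_OF_SOUND = 343.2; // m/s in air at roughly 20 °C

// Distance in metres from the round-trip echo time in seconds.
function echoToDistance(t) {
  return (SPEED_OF_SOUND * t) / 2;
}

// Example: an echo received after 10 ms corresponds to about 1.72 m.
```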

    The sensor

    In the robot two HC-SR04 ultrasonic sensors are used. This sensor is characterized by a measuring range from 2 cm to 400 cm. The measurement result has a maximum deviation of ±3 mm. A complete measuring interval takes 20 ms, which results in a maximum rate of 50 measurements per second. In addition, the sensor draws about 15 mA in normal operation.

    Based on these characteristics, it can be said that the module is suitable for applications in a robot.

    [Image: HC-SR04 ultrasonic distance sensor for obstacle detection]


  • Project update 28.04.2020

    Lars M., 04/28/2020 at 07:24

    This is only a short update about the current status of this project. 

    Right now, I am fixing a few flaws in the design files of the robot.

    First, I changed the suction unit, because a lot of air still leaked out in some places. In addition, the suction opening underneath the robot was enlarged, because long hairs would still partly get stuck. Then I decided to use a LiPo instead of a lead battery. I have written down a few important aspects which should be considered when making this decision.

    The main reason is that LiPo batteries are far lighter than lead batteries. For comparison, the currently installed lead battery weighs about 890 g and has a capacity of 2300 mAh. The LiPo battery I want to use weighs only 219 g and has a capacity of 2600 mAh.

    Furthermore, the LiPo battery is only about two thirds the size of the lead battery.

    Another important note is that LiPo batteries are somewhat more expensive than lead batteries. But the price difference between the two batteries for the robot is only about 5 euros, which I think is manageable.
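    Per gram, the difference is even clearer. A quick calculation with the figures above:

```javascript
// Capacity per unit weight for the two batteries mentioned above.
function specificCapacity(capacity_mAh, weight_g) {
  return capacity_mAh / weight_g; // mAh per gram
}

const lead = specificCapacity(2300, 890); // ~2.6 mAh/g
const lipo = specificCapacity(2600, 219); // ~11.9 mAh/g
// The LiPo stores roughly 4.6 times more charge per gram.
```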

  • Autonomous driving attempt #1

    Lars M., 04/18/2020 at 10:07

    The video in this log shows the first regular test of the autonomous driving unit during the suction process. For the test track I used some old boxes and crates as walls. Normally there are also free-standing objects in a room, but these were not considered in this test.

    For obstacle detection, the vehicle uses only the two ultrasonic sensors mounted on the front. Because they can't see everything, wheel encoders were attached to both wheels. They are based on four small neodymium magnets which are arranged in a circle. A Hall sensor detects the magnetic fields and emits an electrical signal accordingly. The FPGA counts the wheel movements and can therefore determine how far the robot has travelled.
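    Four magnets per wheel give four pulses per revolution, so the distance per pulse is a quarter of the wheel circumference. A small sketch (the 65 mm wheel diameter is a placeholder, not the robot's actual dimension):

```javascript
// Distance travelled per encoder pulse: four magnets per wheel give
// four pulses per revolution. The wheel diameter is an assumed value.
const WHEEL_DIAMETER_M = 0.065; // placeholder, not the real dimension
const PULSES_PER_REV = 4;

function pulsesToDistance(pulseCount) {
  const circumference = Math.PI * WHEEL_DIAMETER_M;
  return pulseCount * (circumference / PULSES_PER_REV);
}

// Four pulses = one full revolution = one wheel circumference.
```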

    Since only two ultrasonic sensors are used, I had to think of a way to detect objects in the blind spot. If the robot collides with an object that is outside the range of the ultrasonic sensors, it stops: due to the weight of the robot, the wheels are no longer able to turn. That's why I implemented a deadlock detection in the FPGA.

    The FPGA stores the current and previous values of the Hall sensors in flip-flops. If both values remain the same, a counter is started. If the counter output exceeds a certain value while the Hall sensor values have not changed, it can be assumed that the wheel is not moving. The wheel state is then compared with the FSM for the motor control. If these two states contradict each other, the robot is stuck.
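    As a behavioural model of that check (not the VHDL itself), the logic could be sketched like this; the threshold stands in for the FPGA counter limit and is an assumed value:

```javascript
// Behavioural model of the deadlock detection described above.
// `threshold` stands in for the FPGA's counter limit (an assumed value;
// on the real hardware this counter runs at the FPGA clock rate).
class DeadlockDetector {
  constructor(threshold = 3) {
    this.threshold = threshold;
    this.prev = null;
    this.counter = 0;
  }

  // Call once per sampling tick with the current Hall sensor value and
  // whether the motor FSM is currently commanding the wheels to turn.
  update(hallValue, motorShouldMove) {
    if (hallValue === this.prev) {
      this.counter += 1; // wheel has not moved since the last tick
    } else {
      this.counter = 0;  // movement detected: restart the count
    }
    this.prev = hallValue;
    // Stuck if the wheel stays idle long enough while the motors drive.
    return this.counter >= this.threshold && motorShouldMove;
  }
}
```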

  • Project introduction

    Lars M., 04/17/2020 at 12:02

    Project history

    As I mentioned in the project overview, this project has already existed for a while. Right now I am working on Proto Lite, the third-generation prototype. The first version was a test prototype for autonomous driving and sensor testing. The second prototype, on the other hand, was used to investigate the cleaning properties.

    The third version combines the main functions of the earlier prototypes. Simply put, the robot should clean and drive. (There are of course other requirements, such as those in the project overview.)


    The electronics are based on an FPGA, which executes all processes to control and monitor the behavior of the robot. It communicates with the motor drivers, controls the sensors, evaluates their data and converts it into useful information. This information can then be used to draw conclusions about the robot's behavior in certain situations, such as obstacle detection.

    Because an FPGA is not programmed the way an Arduino is, the robot has no software program. Instead, a bitstream file containing the circuit description and all other specifications (clock, I/Os) is loaded into the FPGA's configuration memory. The FPGA is then configured to form the desired circuit.

    I have written code in the hardware description language VHDL which describes the circuit of the on-board computer. This code can be synthesized and loaded into the configuration memory of the FPGA, so that the FPGA implements the circuit. The code is still under development, as some functions are still being added.


    I designed the body of the robot in Fusion 360 and 3D-printed all parts. I am using almost exclusively PLA for the prints; an exception is the tread of the wheels, which is printed in TPU. Because there are still a few bugs to be fixed, the design files aren't available right now. For 3D printing I am using a Prusa i3 MK3S.

    This is just an abstract, basic overview of the project. In further articles I will go into detail and try to document my tests and experiences so that they do not benefit me alone.

    If you have suggestions, improvements or whatever, feel free to leave a comment or contact me.
