Wild Thumper based ROS robot

My ROS (Robot Operating System) indoor & outdoor robot

I've built a few robots before, but I was never able to do real navigation with them, e.g. drive successfully from the living room to the kitchen. The reason was that a few infrared and sonar sensors are not enough for 'real' navigation: you can avoid obstacles with them, but not much more. My software was not much better suited for this until I discovered ROS (Robot Operating System) with the rviz GUI a few years ago. Since readily available robots with ROS are a bit expensive, I decided to build a new robot in spring 2015 around the Wild Thumper 4wd chassis. The hardware and electronics were almost completed in a few weeks. Since then I've spent some time on tuning.

The goal is to navigate indoors using SLAM (simultaneous localization and mapping) and outdoors with GPS.

The free space on the back currently hosts a manipulator arm.

Mechanics:

  • Wild Thumper 4wd chassis
  • Motors upgraded with encoders
  • Total weight: 3.3kg

Power supply:

  • Battery: 2x 7.2V NiMH, fused with 30A (slow)
  • 5V via voltage regulator D24V50F5 (5A), fused with 3A (fast)

The two batteries are connected in parallel, each through an LM5050-2 active ORing circuit. Another LM5050-2 can be connected in parallel to supply power from a docking station.

Computer:

  • Solid Run Hummingboard (i.MX6 ARM Cortex-A9 Dual Core 1GHz, 2GB RAM)
  • AVR Atmega32 for motor control
  • AVR Atmega328 (Arduino Nano) for I/O

Peripheral:

  • Hummingboard: GPS (UART), IMU (USB), 3D camera (USB), 2x AVR over I2C, PCA9517 "level translating I2C-bus repeater" to bridge the 3.3V and 5V I2C buses.


Motor control:

  • Motors driven by 4x VNH2SP30, one for each motor, with 20 kHz PWM
  • Speed control (PID) and odometry from the wheel encoders are calculated on the Atmega328 (yes, doing float on it); see the sketch below.
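
The following minimal Python sketch illustrates such a speed control loop. It is an illustration of the control law only, not the actual firmware (which runs in C on the AVR), and the gains are made-up example values:

    # Illustration of a per-motor PID speed controller (not the AVR code itself).
    class SpeedPID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, target_speed, measured_speed):
            # classic PID on the speed error, output clamped to the PWM duty range
            error = target_speed - measured_speed
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            output = self.kp * error + self.ki * self.integral + self.kd * derivative
            return max(-1.0, min(1.0, output))

    # Example: 50 Hz control loop, commanding 0.3 m/s while measuring 0.25 m/s
    pid = SpeedPID(kp=2.0, ki=0.5, kd=0.05, dt=0.02)
    duty = pid.update(0.3, 0.25)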

Inputs/Outputs on Atmega328:

  • 3x sonar distance sensors, 2x infrared distance sensors, battery voltage

Odometry calculation:

  • Odometry from the wheels, corrected with a Tinkerforge IMU Brick 2.0 using Kalman filtering (see the sketch below)
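
As an illustration of the wheel-odometry part, here is a minimal Python sketch of standard differential-drive dead reckoning. This is an illustration only, not the actual AVR/ROS code, and the IMU/Kalman correction is not shown:

    import math

    class DiffDriveOdometry:
        """Dead reckoning from the distances driven by the left and right wheels."""
        def __init__(self, wheel_base):
            self.wheel_base = wheel_base  # distance between left and right wheels [m]
            self.x = self.y = self.theta = 0.0

        def update(self, dist_left, dist_right):
            # distance driven by the robot center and change of heading since the last update
            d = (dist_left + dist_right) / 2.0
            dtheta = (dist_right - dist_left) / self.wheel_base
            # integrate the pose, assuming the motion per update step is small
            self.x += d * math.cos(self.theta + dtheta / 2.0)
            self.y += d * math.sin(self.theta + dtheta / 2.0)
            self.theta = (self.theta + dtheta + math.pi) % (2 * math.pi) - math.pi
            return self.x, self.y, self.theta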

Sensors:

  • Xtion Pro Live depth camera
  • 2x IR 2D120X (1x left, 1x right)
  • 3x sonar SRF05 (2x front, 1x aft).

The sonar sensors cover the dead zone of the depth camera at distances below 0.5 m.

Software:

  • Debian Stretch
  • Robot Operating System (ROS) Kinetic

Object following:

With ultra-wideband (UWB) modules the robot can follow a target; in the following video it follows an R/C car:

Details in the corresponding log "Follow me - part 4".

GPS test video:

The following video shows the robot automatically driving a square defined by four GPS waypoints. The front camera is shown on the lower left, the rviz map video on the upper left.
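
For reference, one way to script such a waypoint run is sketched below. This is a hypothetical illustration, not the code used in the video: it assumes a running move_base, a "map" frame whose origin sits at a known reference latitude/longitude aligned east/north, and uses made-up example coordinates.

    #!/usr/bin/env python
    # Hypothetical sketch: drive a square of GPS waypoints with move_base.
    import math
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    REF_LAT, REF_LON = 52.00000, 8.00000  # assumed lat/lon of the map origin (example values)
    EARTH_RADIUS = 6371000.0              # meters

    def latlon_to_xy(lat, lon):
        # small-area equirectangular approximation around the reference point
        x = math.radians(lon - REF_LON) * EARTH_RADIUS * math.cos(math.radians(REF_LAT))
        y = math.radians(lat - REF_LAT) * EARTH_RADIUS
        return x, y

    rospy.init_node("gps_square")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    # four corners of a small square (example values)
    waypoints = [(52.00010, 8.00000), (52.00010, 8.00015),
                 (52.00000, 8.00015), (52.00000, 8.00000)]
    for lat, lon in waypoints:
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = "map"
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x, goal.target_pose.pose.position.y = latlon_to_xy(lat, lon)
        goal.target_pose.pose.orientation.w = 1.0  # arbitrary fixed heading
        client.send_goal(goal)
        client.wait_for_result()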

LED strip demo:

  • 1 × Wild Thumper 4wd
  • 2 × Battery NiMH 7.2V 5000mAh
  • 2 × LM5050-2 board active ORing Custom board
  • 1 × D24V50F5 5V/5A voltage regulator
  • 1 × Solid Run HummingBoard-Pro + i.MX6 Quad SoM with W-Lan & Bluetooth


  • Get the robot to bring me beer

    Humpelstilzchen 09/07/2021 at 14:37 • 0 comments

    One of the things I want my robot to do is to bring me beer when I'm retired and sitting on the couch. This involves the following actions:

    1. Drive to the fridge
    2. Open the fridge
    3. Locate a bottle of beer
    4. Fetch the bottle
    5. Close the fridge
    6. Drive to me
    7. Hand me the bottle

    My Wild Thumper robot can already drive autonomously to any location in the apartment. Also, thanks to MoveIt, I can position the manipulator at any given x/y/z position that is physically accessible. The current task is to detect, locate and grab a bottle. The detection is done with a YOLO real-time object detection neural network, the 3D location is then determined from the distance given by the 3D camera, and the grabbing is done with MoveIt. For this the following ROS configuration is used:

    The realization with the RealSense 3D camera is done in the following steps:

    1. Use a neural network (NN) to detect a bottle in the RGB image of the 3D camera. The NN runs on a stationary computer because the robot does not have an NN accelerator. The result is a bounding box with row, column, width and height.
    2. Calculate the middle pixel coordinate and get its distance from the depth image. For this the depth image needs to be aligned to the RGB image; the realsense node can do that automatically.
    3. With the known orientation of the 3D camera, use projectPixelTo3dRay() to transform these values into goal coordinates x/y/z relative to the robot (see the sketch after this list).
    4. Position the gripper of the manipulator 5 cm in front of the goal
    5. Open the gripper
    6. Position the middle of the gripper on the goal
    7. Close the gripper
    8. Move the manipulator to a final position.
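
    A minimal sketch of steps 2 and 3 is shown below. It only illustrates the projectPixelTo3dRay() idea using image_geometry and tf2; the camera_info topic and the frame names are assumptions, not necessarily the ones used on this robot:

    #!/usr/bin/env python
    # Sketch: turn a detected pixel (u, v) plus the aligned depth value into an
    # x/y/z point in the robot's base frame (illustration only).
    import rospy
    import tf2_ros
    import tf2_geometry_msgs
    from image_geometry import PinholeCameraModel
    from geometry_msgs.msg import PointStamped
    from sensor_msgs.msg import CameraInfo

    rospy.init_node("bottle_locator")
    tf_buffer = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(tf_buffer)  # keeps the tf buffer filled

    cam_model = PinholeCameraModel()
    cam_model.fromCameraInfo(rospy.wait_for_message("/camera/color/camera_info", CameraInfo))

    def pixel_to_base_link(u, v, depth):
        # ray through the pixel (u, v) in the camera frame, scaled so that its
        # z component equals the measured depth from the aligned depth image
        ray = cam_model.projectPixelTo3dRay((u, v))
        point = PointStamped()
        point.header.frame_id = cam_model.tfFrame()
        point.point.x = ray[0] / ray[2] * depth
        point.point.y = ray[1] / ray[2] * depth
        point.point.z = depth
        # transform the goal point into the robot's base frame
        transform = tf_buffer.lookup_transform("base_link", point.header.frame_id,
                                               rospy.Time(0), rospy.Duration(1.0))
        return tf2_geometry_msgs.do_transform_point(point, transform)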

    The resulting MoveIt script is in the Git repository and the video shows what it looks like:

  • Higher accuracy speech recognition in ROS with Vosk

    Humpelstilzchen 03/20/2021 at 18:15 • 0 comments

    Following the previous log entry "Speech recognition in ROS with PocketSphinx", the recognition of speech was okay (~90% of words recognized correctly) but not great. Also, PocketSphinx is a little dated and its developers are now working on Vosk instead, which itself uses Kaldi.

    With the Vosk server there is an easy-to-use WebSocket API. I use it with the language model vosk-model-small-en-us-0.15, which is optimized for embedded systems:

    python3 ./asr_server.py /opt/vosk-model-small-en-us

    As with PocketSphinx, accuracy can be improved by using a fixed set of words to recognize, so I whitelist words like "forward, backward, stop, left, right, ...". The API does not allow setting a complete grammar, so sentences like "left right" can theoretically be detected, but in contrast to PocketSphinx the accuracy of the detected words with Vosk is good enough that a grammar is not really needed.

    Since the Vosk server uses WebSockets, I skip GStreamer and pipe the 16-bit audio with a sample rate of 16000 Hz from the ReSpeaker Mic Array microphone with SoX directly into my ROS publisher script:

    rec -q -t alsa -c 1 -b 16 -r 16000 -t wav - silence -l 1 0.1 0.3% -1 2.0 0.3% | ./asr_vosk.rb -

    With the silence command, SoX is told to filter out periods of silence longer than 2 seconds, so the Vosk server does not have to process them.

    The Ruby script asr_vosk.rb is based on the PocketSphinx one from the previous log, but the GStreamer/PocketSphinx parts are replaced with WebSockets and Vosk. I use the websocket-eventmachine-client library for WebSocket handling in Ruby.

    #!/usr/bin/ruby
    
    require 'logger'
    require 'websocket-eventmachine-client'
    require 'json'
    require 'ros'
    require 'std_msgs/String'
    
    KEYWORDS = ["wild thumper"]
    CONFIG = {
        "config": {
            "phrase_list": ["angle", "backward", "by", "centimeter",
                            "compass", "current", "decrease", "default",
                            "degree", "down", "eight", "eighteen", "eighty",
                            "eleven", "fifteen", "fifty", "five", "forty",
                            "forward", "four", "fourteen", "get", "go",
                            "hundred", "increase", "left", "light", "lights",
                            "meter", "mic", "minus", "motion", "mute",
                            "nine", "nineteen", "ninety", "off", "on", "one",
                            "position", "pressure", "right", "secure", "set",
                            "seven", "seventeen", "seventy", "silence",
                            "six", "sixteen", "sixty", "speed", "stop",
                            "temp", "temperature", "ten", "thirteen",
                            ...

  • Adding a Manipulator

    Humpelstilzchen 06/04/2020 at 14:10 • 0 comments

    The Wild Thumper robot now has a 4 degrees of freedom (DOF) Open Manipulator with a RealSense D435 depth camera attached:

    The manipulator can be extended to 6 or more DOF if necessary. Communication with the Dynamixel servos is done with the AVR half-duplex software UART library by Ralph Doncaster on an ATtiny85 that translates the commands from the I2C bus. The inverse kinematics of the Open Manipulator are calculated with IKFast and the arm is controlled with MoveIt:
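
    For illustration, moving the gripper to a given x/y/z position with the MoveIt Python commander could look like the following sketch. The planning group name "arm" and the target coordinates are assumptions, not necessarily those of this robot:

    #!/usr/bin/env python
    # Sketch: command the arm to an x/y/z position via MoveIt (illustration only).
    import sys
    import rospy
    import moveit_commander

    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("arm_demo")

    group = moveit_commander.MoveGroupCommander("arm")  # assumed planning group name
    group.set_position_target([0.20, 0.0, 0.15])        # goal x/y/z in meters in the planning frame
    group.go(wait=True)        # plan and execute
    group.stop()               # make sure there is no residual movement
    group.clear_pose_targets()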

  • Path Coverage/Cleaning Algorithm

    Humpelstilzchen 04/04/2020 at 09:03 • 0 comments

    I wanted my Wild Thumper to clean the floor, so I attached a microfoam brush to the front:

    The robot should clean the floor by fully covering a given area with simple back-and-forth boustrophedon motions. Boustrophedon means "the way of the ox", like a plow that is dragged by an ox across a field:

    The area to clean has to be broken down into cells where each cell can be covered by these simple motions. Applications include cleaning and mowing. One algorithm for this is the Boustrophedon Cellular Decomposition which is explained in the linked paper:

    To execute this algorithm with a robot I have created a ROS node which computes the Boustrophedon Cellular Decomposition on a given global costmap and executes the path coverage using the ROS navigation stack:
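
    As an illustration of the motion pattern only (not the actual node, which works on the costmap cells), a back-and-forth sweep over a single rectangular cell can be generated like this:

    # Illustration: generate a simple back-and-forth (boustrophedon) path over one
    # rectangular cell; the real ROS node derives the cells from the global costmap.
    def boustrophedon_path(x_min, x_max, y_min, y_max, tool_width):
        """Return (x, y) waypoints that sweep the rectangle in lanes of tool_width."""
        waypoints = []
        y = y_min
        left_to_right = True
        while y <= y_max:
            if left_to_right:
                waypoints += [(x_min, y), (x_max, y)]
            else:
                waypoints += [(x_max, y), (x_min, y)]
            left_to_right = not left_to_right
            y += tool_width
        return waypoints

    # Example: a 2 m x 1 m cell cleaned with a 0.25 m wide brush
    print(boustrophedon_path(0.0, 2.0, 0.0, 1.0, 0.25))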

  • Improving Wild Thumper suspension

    Humpelstilzchen 03/22/2020 at 15:06 • 0 comments

    In a previous log I mentioned that the wheel suspension of my Wild Thumper was too weak to handle a 1.5 l bottle as load. The suspension can indeed be strengthened, but this requires disassembling the whole chassis. Since I was not very happy with this situation I replaced the default springs with some tunable dampers from R/C cars. The ones I bought are labeled "Dual Spring 80 mm Scale Black Shocks":

    I replaced their default springs with some stronger ones and assembled them into the chassis. Now the suspension can be adjusted a bit more easily:

  • Generating a Wi-Fi heatmap

    Humpelstilzchen 12/26/2018 at 16:19 • 0 comments

    Something useful for my robot to do: having trouble with the Wi-Fi at my parents' house, I decided to create a heatmap of the Wi-Fi signal strength, inspired by this ROSCon 2018 talk. Since I did not understand QGIS in a few minutes and because my use case was rather limited, I created the map based on OpenLayers instead. An example can be seen in the following image created for my small place:

    Given an already created floor plan from SLAM, I drove my robot around and took a measurement of the Wi-Fi signal strength every 0.5 seconds. The signal strength ranges from green (very good) to red (very poor). Since I do not have any problems with my Wi-Fi, all dots are more or less green for my place. From these discrete measurements a contour plot is generated. In this contour plot it can easily be seen that the access point is in the lower left corner (darkest green).

    The steps to reproduce are the following:

    1. Drive around the apartment and take a measurement twice a second. The measurements are saved in a JSON file. This is done with the following quick & dirty python script:

    #!/usr/bin/env python
    # -*- coding: iso-8859-15 -*-

    import os
    import re
    import rospy
    import tf2_ros
    from time import sleep

    rospy.init_node('wifi_strength')

    # listen to tf so the current robot position in the map frame can be looked up
    tfBuffer = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(tfBuffer)
    regex_lq = re.compile(r"Link Quality=(\d*)/(\d*)")
    while not rospy.is_shutdown():
        # read the link quality of the connected access point from iwconfig
        f = os.popen("export LANG=C; /sbin/iwconfig wlan0")
        for line in f:
            line = line.strip()
            match_lq = regex_lq.match(line)
            if match_lq is not None:
                lq = float(match_lq.group(1)) / float(match_lq.group(2))
                # current position of the robot in the map frame
                pos = tfBuffer.lookup_transform("map", 'base_link', rospy.Time(0), rospy.Duration(1.0))
                print '{"x":%.2f, "y":%.2f, "link":%.2f},' % (pos.transform.translation.x, pos.transform.translation.y, lq)
                sleep(0.5)


    The Python script does the following:

    • Retrieve the current position with ros/tf
    • Read the signal strength of the connected access point with a regular expression from iwconfig (value: Link Quality)
    • Append position (x,y) and Link Quality to a JSON array, example:
      [
      {"x":0.32, "y":-0.15, "link":0.69},
      {"x":0.34, "y":-0.12, "link":0.73},
      {"x":0.38, "y":-0.05, "link":0.74}
      ]

     2. Generate a heatmap with matplotlib contourf

    The data is first interpolated from the measurements with 300 steps in each direction from minimum to maximum using linspace and griddata; the heatmap is then generated with contourf. A size of 10x10 meters easily covers my whole apartment.

    #!/usr/bin/env python
    # -*- coding: iso-8859-15 -*-
    
    import json
    import matplotlib.pyplot as plt
    import numpy as np
    from scipy.interpolate import griddata
    
    # settings
    size = (10, 10) # meter x meter
    contour_steps = 300
    resolution= 0.025 # m/px
    
    # ----------
    
    lx = []
    ly = []
    llink = []
    dpi=100.0/(resolution*100)*2.54 # scale 100dpi to our resolution in m/px
    with open("wifi_strength.json", "r") as read_file:
        data = json.load(read_file)
        for line in data:
            lx.append(line['x'])
            ly.append(line['y'])
            llink.append(line['link'])
    
        # for a square image both sizes must be equal
        minimum = min(lx) if min(lx) < min(ly) else min(ly)
        maximum = max(lx) if max(lx) > max(ly) else max(ly)
        if abs(minimum) > abs(maximum):
            maximum = -abs(minimum)
        else:
            minimum = -abs(maximum)
    
        # figure without margin
        fig=plt.figure(num=None, figsize=size, dpi=dpi)
        ax=fig.add_axes((0,0,1,1))
        ax.set_axis_off()
        ax.set_xlim(-size[0]/2, size[0]/2)
        ax.set_ylim(-size[1]/2, size[1]/2)
    
        # map generation
        xi = np.linspace(minimum, maximum, contour_steps)
        yi = np.linspace(minimum, maximum, contour_steps)
        zi = griddata((lx, ly), llink, (xi[None,:], yi[:,None]), method='linear')
        ax.contourf(xi, yi, zi, cmap="RdYlGn", vmin=0, vmax=1)
    
        plt.savefig('heatmap.png', transparent=True)

  • Speech recognition in ROS with PocketSphinx

    Humpelstilzchen 12/08/2018 at 18:12 • 0 comments

    The current goal is to issue voice commands to the Wild Thumper robot. The speech recognition engine PocketSphinx was chosen for this task because it works with little CPU and memory. Since there does not seem to be an up-to-date ROS node for PocketSphinx, I decided to write a simple one. PocketSphinx includes a GStreamer element, so the modular GStreamer framework can help with the audio processing.

    In GStreamer, complex tasks like playing a multimedia file are performed by chaining multiple elements into a pipeline. Each element executes a single task, e.g. reading a file, decompressing data or outputting data to a monitor.

    PocketSphinx requires the audio to be 16-bit little-endian mono at a rate of 16000 Hz. The ALSA plughw interface can provide the input from the USB microphone in this format, so "plughw:1,0" is used as the input device.

    Pocketsphinx is used in two voice recognition modes:

    1. Keyword detection

    The speech recognition should only react to commands when addressed by name, for example "wild thumper stop", not just "stop", because the robot should not react when e.g. a movie is running in the background in which someone says "stop". Also, the robot shall only react to its exact name, not something that sounds similar. PocketSphinx provides a keyword spotting mode for this use case. Input to this mode is the file keywords.kws with a threshold for each keyword:

    wild thumper /1e-11/

    2. Fixed grammar

    After spotting the keyword, PocketSphinx shall recognize a command like "stop", "go forward one meter", "backward", "turn left" or "get voltage". PocketSphinx is run with a given grammar in the Java Speech Grammar Format (JSGF) to avoid a spoken "go forward" accidentally getting recognized as "go four" (yes, this happens a lot). Since "go four" is not allowed in the grammar, it is discarded. This increases the recognition accuracy from ~40% to ~80%. As of today the jsgf option of the PocketSphinx GStreamer element is only supported in unreleased git, so it needs to be compiled from source. The robot.jsgf looks like this:

    #JSGF V1.0;
    
    grammar robot;
    
    <bool> = (on | off);
    <number> = minus* (zero | one | two | three | four | five | six | seven | eight | nine | ten | eleven | twelve | thirteen | fourteen | fifteen | sixteen | seventeen | eighteen | nineteen | twenty | thirty | forty | fifty | sixty | seventy | eighty  | ninety | hundred | thousand | million);
    
    <misc_command> = (light | lights) [<bool>];
    <engine> = (stop | forward | backward | increase speed | decrease speed);
    <get> = get (temp | temperature | light | voltage | current | pressure | mute | mic | silence | speed | velocity | position | angle | compass | motion | secure | engine | odom | humidity);
    <go> = go (forward | backward) <number>+ (meter | meters | centimeter | centimeters);
    <turn> = turn (left | right | (to | by) <number>+ [(degree | degrees)]);
    <speed> = set+ speed <number>+ | set default speed;
    
    public <rules> = <misc_command> | <engine> | <get> | <go> | <turn> | <speed>;
    

    As the acoustic model, the default U.S. English continuous model of PocketSphinx is used together with an MLLR adaption for my specific accent and microphone. According to tests with word_align.pl this improved the recognition accuracy to over 90%.
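
    Before looking at the ROS node, here is a rough standalone sketch of a keyword-spotting-only pipeline in Python. It assumes the GStreamer 1.0 pocketsphinx element (built from recent git) with a kws property; exact property names and message contents may differ between versions, and the cutter/valve plumbing of the real node is omitted:

    #!/usr/bin/env python
    # Sketch: minimal keyword-spotting pipeline (illustration only).
    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.parse_launch(
        "alsasrc device=plughw:1,0 ! audioconvert ! audioresample "
        "! audio/x-raw,format=S16LE,channels=1,rate=16000 "
        "! pocketsphinx kws=keywords.kws ! fakesink")

    def on_element_message(bus, msg):
        # the decoder posts its recognition results as element messages on the bus
        print(msg.get_structure().to_string())

    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message::element", on_element_message)

    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()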

    Pocketsphinx ROS node

    The GStreamer pipeline in the ROS node uses two PocketSphinx elements, one for the keyword spotting mode and one for the JSGF grammar mode. A preceding "cutter" element suppresses low background noise. The valve before the JSGF grammar node...


  • serving beverages etc

    Humpelstilzchen 10/28/2018 at 08:26 • 4 comments

    The robot can also serve beverages :)

    For details see comments:

    In the following video the robot was driven manually with a linear speed of 20 cm/s and an angular speed of 0.5 rad/s.


    Possible enhancements include:

    - Add weight sensors to measure the load

    - Add an accelerometer to the serving plate and linear actuators to its leg to keep the load horizontal.

  • Honoring limits

    Humpelstilzchen 10/20/2018 at 12:09 • 0 comments

    When the Wild Thumper robot is driving around my home there are some areas where it shouldn't go, e.g. because of cables lying around which are too thin for the sensors to spot. The same goes for my Neato Botvac robot vacuum. The Neato people solved that with a magnetic strip that the vacuum is not allowed to drive over. I want my Wild Thumper to honor the same limits. The problem is the relatively high distance of 6-7 cm between the sensor position and the ground:

    The distance is too high for usual magnetic sensors like reed switches or Hall effect sensors, but not too high for a magnetometer, since magnetometers need to be sensitive enough to measure the Earth's magnetic field for their use as compass sensors. Luckily I still had an old CMPS10 IMU lying around, which does the job perfectly. From now on the Wild Thumper can stop at the same boundaries as the vacuum cleaner.

  • Checking I2C pull-ups

    Humpelstilzchen 10/03/2018 at 08:06 • 0 comments

    TL;DR: Always check your signals with an oscilloscope!

    Long version:

    I wanted to try out rosruby, a ROS wrapper for the Ruby programming language, so I decided to port the ROS node for the LPD8806 LED strip of this robot to it. This strip uses a somewhat crippled SPI with only MOSI and SCK to control the lights. While testing the new node the I2C bus immediately and repeatably failed. Since I2C is used as the connection to e.g. the motor controller, the bus is critical for operation. This had not happened before. After ruling out power issues this smelled like a problem with electromagnetic interference, which is why I checked the SPI signals (yellow: SCK, green: MOSI) with an oscilloscope:

    While there are some spikes in the SPI signal, and I'm certainly missing the line impedance matching resistors for termination of the bus, I don't consider the signal bad enough to disrupt a healthy I2C bus.

    So I looked at the I2C bus instead, which resulted in the following horrific picture (yellow: SCL, green: SDA):

    This looks more like a sawtooth than a clean rectangular signal. It is obvious that the value of 5 kΩ for the pull-ups is way too high in this robot: the rising edge is set by the RC time constant of the pull-up resistor and the bus capacitance, so a smaller resistor gives faster edges. After replacing both resistors with 1 kΩ ones the signal is now much better (yellow: SCL, green: SDA):

    Also, the I2C bus no longer fails when sending data on the LPD8806 SPI.


Discussions

ntrewartha wrote 07/21/2020 at 14:47

An interesting project that I will have a much closer look at and modify, perhaps using a Raspberry Pi 4 + Arduino I/O. Have you thought of using lidar?

But lidar modules can be rather expensive. Keep up the excellent work!


Humpelstilzchen wrote 07/21/2020 at 17:00

Thank you, I'm actually using an RPLIDAR A2 now, as can be seen in some of the newer images:

https://cdn.hackaday.io/images/6305991591275227537.4c90b729d9d8c74c82f013098fd80e2d

Sure, an RPi4 + Arduino will work just as well. Personally I discarded the RPi4 because of its high power consumption; I'm currently evaluating the Odroid-C4 instead.


nadiahalh wrote 01/07/2020 at 11:43

Hello, how do you get the odometry data from the wheel encoders? Would you mind sharing the code for computing the odometry? Thank you.


Humpelstilzchen wrote 01/07/2020 at 17:14

The actual code that reads the Gray code of my motor encoders is in the function update_hall1() of main.c, which gives you the distance driven by each wheel. To convert that to 2D odometry see update_pos().


Stewart Teaze wrote 05/28/2019 at 21:01

Apparently, the VNH2SP30 motor driver has been discontinued; from https://www.pololu.com/product/706:

Note: The VNH3SP30 motor driver used by this board has been discontinued by ST. We strongly recommend our carrier board for the newer and better VNH5019 as an alternative.

Here is Pololu link to VNH5019 motor driver carrier board ($24.95): https://www.pololu.com/product/1451

or, perhaps better, the Pololu Dual VNH5019 Motor Driver Shield for Arduino ($49.95): https://www.pololu.com/product/2507; so, for an initial design, control each side (left wheels, right wheels) with one channel of this motor controller.


Humpelstilzchen wrote 10/14/2019 at 18:28

Sorry, for an unknown reason I missed your comment. Yes, one motor driver for each side is usually enough. The extra drivers are only needed in special situations, e.g. with mecanum wheels.


Rud Merriam wrote 08/06/2018 at 16:40

Would like to see the code for this. Can you share it, please?


Humpelstilzchen wrote 08/06/2018 at 16:57

The project is based on the Robot Operating System, so most code comes from there; everything else (e.g. the ROS configuration files and the AVR code) is in my Git repository: https://defiant.homedns.org/gitweb/?p=ros_wild_thumper.git


Rud Merriam wrote 08/06/2018 at 18:52

How can I clone it from that URL?


Humpelstilzchen wrote 08/06/2018 at 19:18


Capt. Flatus O'Flaherty ☠ wrote 07/03/2018 at 08:27

Have you thought of burying RFID tags under your carpet for indoor navigation? I wonder if it would work?


Humpelstilzchen wrote 07/04/2018 at 07:30

Haven't thought of that yet; I'm using SLAM for indoor navigation. Also, I'm currently testing DWM1000 modules.


Capt. Flatus O'Flaherty ☠ wrote 07/04/2018 at 08:12

Searching with Google I only found one instance of it being used: https://www.instructables.com/id/Robotic-Indoor-Navigation-With-RFIDNFC-Tags/


evansgp wrote 07/01/2018 at 05:45

Hi! Your project is awesome and I'm inspired to do something similar as my next project. One question: I was looking at the Wild Thumper chassis and it says the motors that have built-in encoders aren't supported. Which ones are you using? Did you have to do anything special for them to fit?

Edit: Oh, I think I found an answer to my own question: https://www.pololu.com/file/0J571/Wild_Thumper_Encoder_Modification.pdf


Humpelstilzchen wrote 07/04/2018 at 07:28

Thanks, and the PDF is correct: you will need an additional hole. By the way, if you get the 6WD base you will only need encoders for the two center motors.

