A Self-Driving Car using a Raspberry Pi Zero

Using Ogma Corp's EOgmaNeo machine learning library, we created a tiny vision-based self-driving car, powered by a Raspberry Pi Zero and weighing 102g. It learns online from the user in real-time, and then drives on its own with the flick of a switch!

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Contact Ogma via licenses@ogmacorp.com to discuss commercial use and licensing options.

Overview

This project describes how to build a micro Self-Driving Car (uSDC), a smaller version of Ogma Corp's 1/10th-scale R/C-based self-driving car. Both versions use a controller to drive the SDC and a front-facing camera to provide input to an online Machine Learning predictive algorithm, which learns how to drive autonomously around a track. Details of both versions can be seen on GitHub here.

The Machine Learning algorithm uses an online learning technique known as Sparse Predictive Hierarchies (SPH). The predictive hierarchy takes streaming data as input, in this case a camera image and steering information, and predicts (infers) the steering values it expects to see next. The predicted steering can also be fed back into the hierarchy as input, enabling autonomous driving behaviour.
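As a rough illustration, the closed loop looks something like the sketch below. The `sph_step` callable stands in for the EOgmaNeo hierarchy update and is a hypothetical name, not the library's actual API; it is only meant to show how predictions are fed back in when self-driving:

# Conceptual sketch of the SPH training / self-driving loop.
# sph_step is a hypothetical stand-in for the EOgmaNeo hierarchy update,
# not the library's real API.
def drive_loop(sph_step, get_camera_frame, get_user_steering, set_steering, self_driving):
    predicted_steering = 0.0

    while True:
        frame = get_camera_frame()

        if self_driving:
            # Feed the previous prediction back in as the steering input.
            steering_in = predicted_steering
        else:
            # Training: show the hierarchy the human driver's steering.
            steering_in = get_user_steering()

        # The hierarchy observes (image, steering) and predicts the next steering value.
        predicted_steering = sph_step(frame, steering_in, learn=not self_driving)

        # Act on either the human or the predicted steering.
        set_steering(steering_in if not self_driving else predicted_steering)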

A Unity-based simulation, OgmaDrive, was used to prototype the SDC, using C# scripts and Ogma Corp's EOgmaNeo library (which contains the SPH implementation).


Control of the uSDC

There are two Python3 scripts included in the GitHub repository to test the uSDC:

motorTest.py - A script to test the Explorer pHAT and Steam controller, and to drive the uSDC around.
main.py - The main script, which uses the EOgmaNeo library to provide learning, inference, and self-driving on the RPi ZeroW.


Steam controller daemon

Both of the included Python scripts require the Steam controller daemon to be running. Once started, the daemon presents the Steam controller as an Xbox controller. PyGame can then connect to it and obtain joystick and button information, which the motorTest.py and main.py scripts use to drive the uSDC around.

Starting the daemon requires the following bash command:

sudo python3 ~/steamcontroller/scripts/sc-xbox.py start

and the following command to stop the daemon:

sudo python3 ~/steamcontroller/scripts/sc-xbox.py stop
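Once the daemon is running, PyGame sees the emulated Xbox controller as an ordinary joystick. A minimal read loop looks roughly like this; the axis and button indices are assumptions and may differ on a given setup:

import pygame

# Minimal joystick read loop. Axis and button numbers are assumptions
# and may differ depending on how the controller is emulated.
pygame.init()
pygame.joystick.init()

joystick = pygame.joystick.Joystick(0)
joystick.init()

running = True
while running:
    pygame.event.pump()  # refresh the internal joystick state

    steering = joystick.get_axis(0)        # left/right on the analogue stick
    throttle = joystick.get_axis(5)        # assumed trigger axis
    exit_pressed = joystick.get_button(0)  # assumed 'A' button

    if exit_pressed:
        running = False

pygame.quit()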


The motorTest python script

To test the Steam controller and Explorer pHAT motor driver, the following commands and script can be used:

cd ~/EOgmaDrive/Configuration3
sudo python3 ~/steamcontroller/scripts/sc-xbox.py start
sudo python3 motorTest.py
sudo python3 ~/steamcontroller/scripts/sc-xbox.py stop

The controls are similar to those used in the main.py script:

  • The joystick controls left or right steering.
  • Trigger buttons apply forward/backward drive.
  • The `A` button cleanly exits the script.

The `trimming` global variable applies a small correction to the forward/backward drive so that the uSDC travels in a straight line when a trigger is held, as sketched below.
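The trim can be applied as a small bias between the two motor speeds. The sketch below assumes Pimoroni's `explorerhat` Python module and a `trimming` value in the range -1.0 to 1.0; the exact scaling and sign convention in motorTest.py may differ:

import explorerhat

trimming = 0.2  # assumed range -1.0 to 1.0; sign depends on which way the car drifts

def drive(throttle):
    # Apply a throttle value in the range -1.0 to 1.0 to both motors,
    # biased by the trimming value so the uSDC tracks straight.
    left = max(-1.0, min(1.0, throttle + trimming))
    right = max(-1.0, min(1.0, throttle - trimming))

    # Explorer pHAT motor speeds are given as percentages (-100 to 100).
    explorerhat.motor.one.speed(int(left * 100))
    explorerhat.motor.two.speed(int(right * 100))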


The main python script

The `main.py` Python3 script is used for training (manual driving) and inference (self driving). All processing occurs on the Raspberry Pi ZeroW board.

It can be started using the following bash commands:

cd ~/EOgmaDrive/Configuration3
sudo python3 ~/steamcontroller/scripts/sc-xbox.py start
sudo python3 main.py
sudo python3 ~/steamcontroller/scripts/sc-xbox.py stop

The uSDC can then be controlled using the Steam controller with:

  • The analogue joystick controls left or right steering.
  • Trigger buttons apply forward/backward drive.
  • The `A` button toggles between training and prediction modes.
  • The `B` button saves the current camera image.
  • The `X` button saves the current state of the hierarchy.
  • The `Y` button exits the script.

If you train and save out the current state of the hierarchy, that saved state can be reloaded by starting the script with a `load` parameter. For example:

sudo python3 main.py load
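Internally this amounts to checking the command-line arguments before the hierarchy is created. The sketch below is hypothetical: `load_hierarchy` and `create_hierarchy` are illustrative names, and the filename is not the one used by main.py:

import sys

# Hypothetical handling of the 'load' parameter; function names and
# filename are illustrative, not taken from main.py.
if len(sys.argv) > 1 and sys.argv[1] == "load":
    hierarchy = load_hierarchy("hierarchy.save")  # restore a previously saved hierarchy
else:
    hierarchy = create_hierarchy()                # start with a fresh, untrained hierarchy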

Note: Saving the hierarchy to a file on the SD card can take a minute or so. Console output announces when saving starts and when it has completed.

If the uSDC doesn't travel in a straight line when one of the trigger buttons is held, the `trimming` global variable can be modified to compensate for the drift, for example `trimming = 0.2`.

An `RGB` image is captured from the camera module at each time step. This is converted into a grey scale...
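The capture and grey-scale conversion can be sketched roughly as follows, assuming the `picamera` module and a standard luminance weighting; the resolution and exact conversion used in main.py may differ:

import numpy as np
import picamera
import picamera.array

# Capture one RGB frame and convert it to grey scale.
# The resolution and conversion weights here are assumptions.
with picamera.PiCamera() as camera:
    camera.resolution = (64, 64)
    with picamera.array.PiRGBArray(camera) as output:
        camera.capture(output, format='rgb')
        rgb = output.array.astype(np.float32)

# Standard luminance weighting for RGB to grey scale.
grey = 0.299 * rgb[:, :, 0] + 0.587 * rgb[:, :, 1] + 0.114 * rgb[:, :, 2]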


Step 1 - Acquiring Parts and 3D Printing


The above image ties in with the Bill Of Materials (BOM) found here:
https://github.com/ogmacorp/EOgmaDrive/blob/master/Configuration3/BillOfMaterials.md

A 3D printer, or printing service, can be used to make the main base and mounting parts. All other items should be readily available from online suppliers.

The 3D printing service we used created a case study of our print work, which can be found here: https://www.3dprint-uk.co.uk/portfolio-item/ogma-corp-richard-crowder-self-drive-car/

Step 2 - Mounting front ball caster, motors, and battery


  • Glue the wheels to the Pololu motors. Take care not to get any glue into the black motor gearbox casing.
  • Slot the motors into the base. Feed the motor cables through the hole in the base.
  • Bolt the Pololu ball caster to the base. Use the larger spacing plate supplied with the ball caster.
  • Slide the Adafruit LiPo battery into the base.
Step 3 - Solder headers to the RPi ZeroW and Explorer pHAT


  • Solder GPIO headers onto the Raspberry Pi ZeroW (male header) and Explorer pHAT (female header).
  • Stick a heat sink onto the Pi ZeroW Broadcom CPU.

