
Dobby - DIY Voice Controlled Personal Assistant

Voice Controlled Personal Assistant robot to help you stay lazy using Raspberry Pi and Amazon Alexa


Voice controlled personal assistants have made their way into our daily lives through software embedded in today's smartphones. Siri, Google Voice, Cortana and Alexa are a few recognizable names.

Project Dobby extends the benefits of voice recognition and speech processing to robotics. Dobby follows a user's voice commands and can carry out pre-programmed physical actions. It can be a waiter, a warehouse assistant or a nurse. At home, it can get you coffee from the kitchen, fetch your morning newspaper from the front door or take a selfie when you are partying with friends. The hardware and software can easily be adapted to any new environment.

In the future we plan to extend Dobby with human-following capabilities so that it can serve as an assistant in an unknown environment. It could carry luggage at airports and supermarkets, or ferry people in wheelchairs and kids in prams, all without human effort and while avoiding obstacles.

[Image: Representative schematic of the Dobby circuit]

  • 1 × Raspberry Pi 3B
  • 1 × Amazon Tap (Alexa-enabled voice-controlled speaker)
  • 2 × DC Motor Driver 20A, 6-24V load
  • 4 × High Torque DC Geared Motor 30 Kg-cm - 300 RPM - 7.5A Max Load Current
  • 10 × Female to Male Jumper wires

View all 11 components

  • Day 2 : First Steps for Human Follower

Aditya Agarwal, 11/08/2016 at 16:52

Developing an autonomous human follower that doesn't rely on additional sensors or an environment map is a problem of relative localization. The position of the user relative to Dobby needs to be tracked at all times so that Dobby can follow the user without requiring an environment map. Further, the user's smartphone needs to communicate this data to Dobby via Bluetooth.

We have started work on Bluetooth communication between Dobby and the user's phone over the GATT protocol. We have experimented with a few libraries so far but haven't found a reliable one yet.

We have also developed an Android app for the user's smartphone that tracks their location by using the phone's magnetometer as a compass and by using dead-reckoning to estimate the user's displacement and speed. Dead-reckoning relies on the natural bounce of the human body, which can be measured with the phone's accelerometer; peaks in this data give the user's speed and displacement. We are building on our previously published work on human gait analysis.
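To make the idea concrete, here is a minimal Python sketch of the peak-counting approach (the real app runs on Android; the threshold and stride length below are made-up values that need per-user tuning):

```python
# Hypothetical sketch of dead-reckoning by step counting: detect peaks in
# the accelerometer magnitude and estimate displacement as steps * stride.
import math

STEP_THRESHOLD = 11.0  # m/s^2, just above the gravity baseline; tune per user
STRIDE_LENGTH = 0.7    # metres per step, an assumed average

def count_steps(samples):
    """samples: list of (ax, ay, az) accelerometer readings in m/s^2."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        # Count one step per upward crossing of the threshold (a "bounce" peak).
        if magnitude > STEP_THRESHOLD and not above:
            steps += 1
            above = True
        elif magnitude < STEP_THRESHOLD:
            above = False
    return steps

def displacement_metres(samples):
    return count_steps(samples) * STRIDE_LENGTH
```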


  • Day 1 : Indoor Localization of Dobby

Aditya Agarwal, 11/08/2016 at 16:38

Since our idea to use Estimote beacons for indoor localization of Dobby did not work, we are currently testing a compass-based approach. A simple compass can be built from a magnetometer, and combined with information about Dobby's speed of motion it can be used to localize Dobby's position in an indoor environment. We followed a tutorial at Instructables to integrate a simple digital compass with our Raspberry Pi.
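For reference, here is a minimal compass read-out of the kind such tutorials build, assuming an HMC5883L magnetometer on the Pi's I2C bus (the register addresses are specific to that chip; adapt them to your breakout):

```python
# Minimal compass sketch, assuming an HMC5883L magnetometer on I2C bus 1.
import math
import smbus

ADDRESS = 0x1E  # HMC5883L default I2C address

bus = smbus.SMBus(1)
bus.write_byte_data(ADDRESS, 0x02, 0x00)  # mode register: continuous measurement

def read_axis(reg):
    """Read one signed 16-bit axis value from two consecutive registers."""
    hi = bus.read_byte_data(ADDRESS, reg)
    lo = bus.read_byte_data(ADDRESS, reg + 1)
    value = (hi << 8) | lo
    return value - 65536 if value > 32767 else value

def heading_degrees():
    x = read_axis(0x03)  # X MSB/LSB
    y = read_axis(0x07)  # Y MSB/LSB (the Z registers sit between X and Y)
    heading = math.degrees(math.atan2(y, x))
    return heading + 360 if heading < 0 else heading

print("Heading: %.1f deg" % heading_degrees())
```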

  • Day 0 : Hackathon Idea

Dementor, 10/22/2016 at 17:08

The project idea started at Sequoia Hack 2016 in Bangalore, India, where we decided to use two of the devices being provided, the Amazon Tap and Estimote beacons, to make an indoor-navigable voice-controlled robot: Dobby.

The plan was to use Estimotes for indoor localisation and the Alexa API for voice control, mapping spoken commands to intents. Once we had set up the hardware, we created the Lambda to map commands to intents; the next challenge was to process the commands and send them to the on-board Raspberry Pi for execution. To get the prototype working within 24 hours, we found Adafruit IO and IFTTT triggers, which let us easily convert voice commands into GPIO signals.

Next we spent hours during the hackathon testing the Estimotes for localisation, but the data from the Estimote BLE beacons was noisy and highly uncertain. Despite 5 hours of trying to hack together a localisation solution with them, we had to give up on it.

We concluded the hackathon with a voice-controlled robot.

Next, we created the GitHub project and added issues to it to track the work of making Dobby a complete indoor-assistant robot.

    More logs as we work more on it!

    P.S. Dobby also took many Selfies during the Hackathon.


  • Step 1

The first task is to set up Alexa communication with an online service. Here we use Alexa together with IFTTT and Adafruit: Amazon Alexa converts speech to text, which is sent to IFTTT, and IFTTT then sends different inputs to Adafruit feeds depending on the spoken text.

1) Create an Amazon account and download the Amazon Alexa app on your smartphone. Sign in to the app with your account and set up the Amazon Tap by following the on-screen instructions.

    2) Create an IFTTT account.

    3) Create an account at Adafruit.

4) Create your first recipe at IFTTT by going to Create Recipe.

5) Choose Amazon Alexa as the trigger channel. IFTTT will ask you to authorize using an Amazon account; use the account created in step 1) to do so.

6) Select "Say a specific phrase" in the next step. When asked for a trigger text string (Step 3 of the recipe), use "go forward". This is the phrase that will be spoken to Alexa when you want the robot to go forward. The full phrase will be "Alexa trigger go forward"; the "Alexa trigger" prefix tells Alexa to fire an IFTTT trigger.

7) The next step is to set up Adafruit to receive data from IFTTT. Open a new browser tab and log in to the Adafruit account created in step 3). Don't close IFTTT, as the setup is not yet complete.

8) Create a feed on Adafruit by going to the 'Your Feeds' option in the left side-bar. We will name it "Dobby".


9) Now we will create a Dashboard for debugging, to check whether the correct data is being published to the feed through IFTTT. Create a Dashboard by going to "Your Dashboards" in the left-hand menu. After creating the Dashboard we will add 2 "Blocks" to it: one for watching data coming in from IFTTT, and one for manually sending data to the feed to check that it reaches the robot correctly.

10) Create the 2 blocks, a slider (for sending data to the feed) and a stream (for watching incoming IFTTT data), and select the feed created in step 8).

11) We are now done with the Adafruit setup. Go back to IFTTT and choose an action by selecting Adafruit; use the account created in step 3) to authorize. In the next step, select "Send data to Adafruit I/O".

12) Now select the Adafruit feed we created earlier and set the data to save to the feed to the number "1". This means that whenever we say "Alexa trigger go forward", a value of 1 will be saved to our Adafruit feed.
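If you prefer checking the feed from a script instead of the Dashboard blocks, the io-client-python library can write to and read from the same feed. A rough sketch (the credentials are placeholders, and older library versions take only the key in the constructor):

```python
# Sketch: publish to and read from the "Dobby" feed with io-client-python.
from Adafruit_IO import Client

ADAFRUIT_USER = "your-username"  # placeholder
ADAFRUIT_KEY = "your-aio-key"    # placeholder

aio = Client(ADAFRUIT_USER, ADAFRUIT_KEY)  # older versions: Client(ADAFRUIT_KEY)

aio.send("Dobby", 1)           # mimic the IFTTT action: write "1" to the feed
latest = aio.receive("Dobby")  # read the most recent value back
print("Feed value:", latest.value)
```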

  • Step 2

The next step is to subscribe to the Adafruit feed on the Raspberry Pi. Adafruit provides a Python API which, given your `access_key`, gives access to the feeds we created.

Refer to https://github.com/adafruit/io-client-python for the API documentation on accessing feed data and subscribing and publishing to feeds.

Using the API, we subscribe to the feed and control the GPIO outputs to the motors to drive the robot. Refer to `dobby.py` in the GitHub repository.
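For context, here is a stripped-down sketch of that subscribe-and-react loop, assuming the MQTTClient from io-client-python; the credentials and the handle_command() handler are placeholders, not the actual `dobby.py`:

```python
# Sketch of the feed-subscription loop, assuming Adafruit_IO's MQTTClient.
from Adafruit_IO import MQTTClient

ADAFRUIT_USER = "your-username"  # placeholder
ADAFRUIT_KEY = "your-aio-key"    # placeholder
FEED = "Dobby"

def handle_command(value):
    # Placeholder: dobby.py maps feed values to GPIO motor actions here,
    # e.g. "1" -> go forward (see Step 3 for the motor control side).
    print("Received command:", value)

def on_connect(client):
    client.subscribe(FEED)

def on_message(client, feed_id, payload):
    handle_command(payload)

client = MQTTClient(ADAFRUIT_USER, ADAFRUIT_KEY)
client.on_connect = on_connect
client.on_message = on_message
client.connect()
client.loop_blocking()  # block forever, dispatching feed updates
```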

  • Step 3

The last step is GPIO control on the Pi, which drives the motors via the DC motor drivers. We use a simple differential-drive scheme for the motors.



Make the connections between the motors and the motor drivers as shown in the diagram.



The motor drivers we used take 3 inputs per motor: Brake, Direction and PWM. The PWM pin is given a static PWM value to limit the speed. Driving the Brake pin high stops the motor, while toggling Direction between 1 and 0 switches the rotation between clockwise and anti-clockwise.

The Raspberry Pi's GPIO pins are then set according to the received command so that the robot executes the corresponding action; a sketch follows below. Refer to https://sourceforge.net/p/raspberry-gpio-python/wiki/BasicUsage/ for GPIO control using Python on the Raspberry Pi.
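As a sketch of what this looks like in code (the pin numbers are placeholders; map them to your drivers' Brake, Direction and PWM inputs):

```python
# Differential-drive sketch with RPi.GPIO. Pin numbers are hypothetical;
# each motor driver exposes Brake, Direction and PWM inputs.
import RPi.GPIO as GPIO

LEFT_DIR, LEFT_BRAKE, LEFT_PWM = 17, 27, 22      # placeholder BCM pins
RIGHT_DIR, RIGHT_BRAKE, RIGHT_PWM = 23, 24, 25   # placeholder BCM pins

GPIO.setmode(GPIO.BCM)
for pin in (LEFT_DIR, LEFT_BRAKE, LEFT_PWM, RIGHT_DIR, RIGHT_BRAKE, RIGHT_PWM):
    GPIO.setup(pin, GPIO.OUT)

# Static PWM duty cycle to keep the speed down, as described above.
left_pwm = GPIO.PWM(LEFT_PWM, 100)   # 100 Hz carrier
right_pwm = GPIO.PWM(RIGHT_PWM, 100)
left_pwm.start(40)                   # ~40% duty cycle
right_pwm.start(40)

def drive(left_forward, right_forward):
    """Release the brakes and set each wheel's rotation direction."""
    GPIO.output(LEFT_BRAKE, GPIO.LOW)
    GPIO.output(RIGHT_BRAKE, GPIO.LOW)
    GPIO.output(LEFT_DIR, GPIO.HIGH if left_forward else GPIO.LOW)
    GPIO.output(RIGHT_DIR, GPIO.HIGH if right_forward else GPIO.LOW)

def forward():
    drive(True, True)

def turn_left():
    drive(False, True)   # wheels counter-rotate to spin in place

def turn_right():
    drive(True, False)

def stop():
    GPIO.output(LEFT_BRAKE, GPIO.HIGH)   # Brake high stops the motor
    GPIO.output(RIGHT_BRAKE, GPIO.HIGH)
```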

View all 4 instructions
