
GUILLENGAP Cobot

Collaborative Robot Arm with Artificial Intelligence, Used as an Assistant in: 1) a Voice Command System & 2) an Object Sorting System

This design uses robotics in a unique way: as a personal assistant! It is a collaborative robot arm, built in three versions, to assist you with several tasks.

VERSION 3...
Finally, my third version. Here I'm sorting objects by color using TensorFlow and Arduino boards: the system detects blue, green, and red balls and places each one in its corresponding container.

VERSION 2...
Here I'm adding more functions to my first version: a) controlling a bottle, b) controlling a breadboard, and c) controlling a box. I continue using AI, but with different hardware (Alexa Echo Dot, ESP32-WROOM-32, and Arduino Nano 33 BLE Sense) and free open-source tools (the FauxmoESP library).

VERSION 1...
A robot arm that picks and places an object such as a box. Here I'm using the "MATRIX Creator" and Raspberry Pi boards to control a robotic arm by means of Artificial Intelligence (AI), with voice commands and the "Snips" assistant.

Hi everyone! Below I show you summaries of my Cobot in its three versions:

Version 3...

A video summarizing the journey of the third version. Here I'm going to sort three objects by color using TensorFlow.

Version 2...

A video summarizing the journey of this project in its second version. Here I'm using different hardware and software, and I'm showing three examples: 1) Controlling a Bottle, 2) Controlling a Breadboard, and 3) Controlling a Box.


Version 1...

A video summarizing the journey of this project in the first version: main goal, features, solutions, content, and hardware and software prerequisites.


guillengap_ver1.zip

First code version of the robot arm

x-zip-compressed - 2.66 kB - 06/27/2021 at 23:09


guillengap_ver2.zip

Second code version of the robot arm

x-zip-compressed - 4.30 kB - 08/10/2021 at 02:09


guillengap_ver3.zip

Third code version of the robot arm

x-zip-compressed - 11.51 kB - 09/08/2021 at 03:02


circuit_diagram.zip

Circuit diagram of the robot arm, version 1

x-zip-compressed - 295.79 kB - 06/28/2021 at 01:11


circuit_diagram_v2.zip

Circuit diagram of the robot arm, version 2

x-zip-compressed - 1.09 MB - 08/17/2021 at 23:31


View all 8 files

  • 1 × Raspberry Pi 3 Model B+
  • 1 × MATRIX Labs MATRIX Creator
  • 1 × Alexa Echo Dot (3rd Gen)
  • 1 × ESP32-WROOM-32
  • 1 × Arduino Nano 33 BLE Sense

View all 13 components

  • Step 12. GUILLENGAP Robot Arm 3rd Version - Sorting Objects by Color Using TensorFlow

    Guillermo Perez Guillen, 09/08/2021 at 01:10

    This is the third version of my project, the "GUILLENGAP Robot Arm". Here, I'm going to sort objects by color using TensorFlow.

    HARDWARE

    The schematic diagram below shows the electrical connections of the components.

    How does it work?

    1. The Arduino Nano 33 BLE Sense board first detects a ball that is less than 3 cm away; then the color sensor captures the ball's color data, and TensorFlow computes the object's color with a prediction above 98%. An RGB LED also lights up according to the color of the ball.
    2. Finally, the Arduino Nano 33 IoT board is notified of the detected ball's color, and the robotic arm grips the ball and places it in the container of the matching color. (A minimal sketch of this flow is shown after this list.)
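
    Below is a minimal sketch of this flow for the Nano 33 BLE Sense board. It is only an illustration: the proximity threshold, the Serial1 link to the Nano 33 IoT, and the simplified classifyColor() stub are assumptions; my real sketch (object_color_classifier.ino) runs the TensorFlow model from model.h instead.

    // Illustrative sketch of the version 3 detection flow (not the real code)
    #include <Arduino_APDS9960.h>

    enum BallColor { BALL_RED = 'r', BALL_GREEN = 'g', BALL_BLUE = 'b' };

    // Placeholder for the TensorFlow inference: it just picks the dominant
    // channel so that this sketch stays self-contained.
    BallColor classifyColor(int r, int g, int b) {
      if (r >= g && r >= b) return BALL_RED;
      if (g >= b) return BALL_GREEN;
      return BALL_BLUE;
    }

    void setup() {
      Serial1.begin(9600);                  // assumed link to the Nano 33 IoT
      while (!APDS.begin()) delay(100);     // proximity + color sensor
      pinMode(LEDR, OUTPUT);
      pinMode(LEDG, OUTPUT);
      pinMode(LEDB, OUTPUT);
    }

    void loop() {
      if (!APDS.proximityAvailable() || !APDS.colorAvailable()) return;
      if (APDS.readProximity() > 50) return;        // ball not close enough yet

      int r, g, b;
      APDS.readColor(r, g, b);                      // raw color reading

      BallColor color = classifyColor(r, g, b);
      digitalWrite(LEDR, color == BALL_RED   ? LOW : HIGH);   // on-board LEDs are active LOW
      digitalWrite(LEDG, color == BALL_GREEN ? LOW : HIGH);
      digitalWrite(LEDB, color == BALL_BLUE  ? LOW : HIGH);

      Serial1.write((char)color);                   // notify the Nano 33 IoT
      delay(3000);                                  // give the arm time to pick the ball
    }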

    SOFTWARE:

    1. To identify the blue, green, and red balls, I followed the steps of this tutorial published on the official Arduino site: Fruit identification using Arduino and TensorFlow
    2. I captured my three files in CSV format: blue.csv, green.csv, and red.csv (a capture sketch is shown after this list)
    3. Subsequently, I opened the FruitToEmoji Jupyter Notebook in Colab
    4. I followed the instructions for applying TensorFlow.
    5. Finally, I generated and downloaded my "model.h" file
    6. I uploaded the sketch "object_color_classifier.ino" to my Arduino Nano 33 BLE Sense board; you can download it from my GitHub repository
    7. I uploaded the sketch "object_color_robot_arm" to my Arduino Nano 33 IoT board; it is also available in my GitHub repository
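
    Regarding step 2, the tutorial's capture sketch prints normalized color readings over the serial port; you hold each ball over the sensor and save the output as blue.csv, green.csv, and red.csv. A minimal sketch along those lines (details may differ from the exact code I used):

    // Minimal capture sketch: prints normalized R,G,B samples as CSV lines.
    // Open the Serial Monitor, hold a ball over the sensor, and save the
    // output as blue.csv, green.csv, or red.csv for training in Colab.
    #include <Arduino_APDS9960.h>

    void setup() {
      Serial.begin(9600);
      while (!Serial);
      if (!APDS.begin()) {
        Serial.println("Error initializing APDS9960 sensor.");
        while (1);
      }
      Serial.println("Red,Green,Blue");     // CSV header
    }

    void loop() {
      // Wait until both color and proximity readings are ready
      while (!APDS.colorAvailable() || !APDS.proximityAvailable()) delay(5);

      int r, g, b;
      APDS.readColor(r, g, b);
      int p = APDS.readProximity();

      // Only log samples when the ball is right over the sensor and not too dark
      if (p == 0 && (r + g + b) > 10) {
        float sum = r + g + b;
        Serial.print(r / sum, 3); Serial.print(',');
        Serial.print(g / sum, 3); Serial.print(',');
        Serial.println(b / sum, 3);
      }
    }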

    IMAGES

    Below I show images of assembling the prototype and of the tests:

    Here we can see the Arduino NANO 33 BLE Sense board and the wooden platform with a green ball

    Detecting the red ball

    Moving the red ball

    Placing the red ball inside the container

    Test where we can see the percentage of prediction

    TEST 

    In the video below I show you the test with high percentage of prediction.

    CONCLUSION:

    • This system for sorting objects by color worked satisfactorily; I got prediction percentages above 98% and without any errors.
    • Finally, I achieved my goal, since this project has many applications for working at home, e.g.: 1) sorting mechanical parts in a workshop; 2) sorting fruits or vegetables; 3) sorting colored rubber balls, toys, etc.

  • Step 11. GUILLENGAP Robot Arm 2nd Version - Test

    Guillermo Perez Guillen, 07/30/2021 at 20:30

    TESTS

    In the video below I show you the tests with a water bottle.

    Now in the video below, the robot arm helps me unsolder a breadboard's wire.

    Finally, and as in the first version, I order the robot arm to move a box containing a lamp.

    CONCLUSION 

    This was a good experience, as I used a free tool to control my robot arm with the Alexa Echo Dot device. I also increased its functions without problems. The results were satisfactory, and you can see them in the videos above.

  • Step 10. GUILLENGAP Robot Arm 2nd Version - HW & SW

    Guillermo Perez Guillen, 07/29/2021 at 19:40

    As part of the continuous improvements, in this prototype I experimented with different hardware and software, using free open-source tools as shown below.

    HARDWARE

    The schematic diagram below shows the electrical connections of the electronic components.

    How does it work?

    1. We use voice commands to perform three functions: 1) controlling a bottle; 2) controlling a breadboard; and 3) controlling a box.
    2. The voice commands are activated through the Alexa Echo Dot and reach the ESP32-WROOM-32 board via WiFi.
    3. Finally, these voice commands are transmitted to the Arduino Nano 33 BLE Sense board, which controls the robot arm. (A sketch of the receiving side is shown after this list.)
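
    The sketch below only illustrates the idea on the receiving side: the Nano 33 BLE Sense listens for a single-character command on a serial line and moves the arm accordingly. The command characters, servo pins, and angles are placeholders, not my actual wiring or code.

    // Illustrative receiver: one character per voice command.
    // 'b' = bottle routine, 'p' = breadboard routine, 'x' = box routine.
    #include <Servo.h>

    Servo base, shoulder, gripper;

    void moveArm(int baseDeg, int shoulderDeg, int gripDeg) {
      base.write(baseDeg);
      shoulder.write(shoulderDeg);
      gripper.write(gripDeg);
    }

    void setup() {
      Serial1.begin(9600);     // assumed serial link from the ESP32
      base.attach(2);
      shoulder.attach(3);
      gripper.attach(4);
    }

    void loop() {
      if (!Serial1.available()) return;
      switch (Serial1.read()) {
        case 'b': moveArm(30, 60, 20);  break;   // hold the bottle
        case 'p': moveArm(90, 45, 10);  break;   // hold the breadboard
        case 'x': moveArm(150, 70, 30); break;   // move the box
      }
    }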

    SOFTWARE

    The FauxmoESP

    To control the ESP32 with the Alexa Echo Dot, you need to install the FauxmoESP library. This library emulates a Belkin WeMo device, allowing you to control your ESP32 using that protocol. This way, the Echo Dot recognizes the device instantly after the code is uploaded, without any extra skills or third-party services. You can read more about FauxmoESP here.
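
    As a rough sketch of how the library can be used (the Wi-Fi credentials, device names, and the serial forwarding to the Nano 33 BLE Sense are illustrative; the actual code is in my repository):

    // Minimal FauxmoESP sketch: exposes three virtual devices to the Echo Dot
    // and forwards each "turn on" command over Serial2 to the Nano 33 BLE Sense.
    #include <WiFi.h>
    #include <fauxmoESP.h>

    fauxmoESP fauxmo;

    void setup() {
      Serial.begin(115200);
      Serial2.begin(9600);                          // assumed link to the Nano

      WiFi.begin("YOUR_SSID", "YOUR_PASSWORD");
      while (WiFi.status() != WL_CONNECTED) delay(100);

      fauxmo.createServer(true);                    // required for newer Echo devices
      fauxmo.setPort(80);
      fauxmo.enable(true);

      fauxmo.addDevice("bottle");
      fauxmo.addDevice("breadboard");
      fauxmo.addDevice("box");

      // Called when Alexa switches a device on or off
      fauxmo.onSetState([](unsigned char id, const char *name, bool state, unsigned char value) {
        if (!state) return;                         // only act on "turn on"
        if (strcmp(name, "bottle") == 0)     Serial2.write('b');
        if (strcmp(name, "breadboard") == 0) Serial2.write('p');
        if (strcmp(name, "box") == 0)        Serial2.write('x');
      });
    }

    void loop() {
      fauxmo.handle();      // keep answering Alexa discovery and requests
    }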

    Installing the ESP32 Board in Arduino IDE

    To upload code to your ESP32 from the Arduino IDE, you need to install an add-on that lets the IDE compile for and program ESP32 boards. You can read more about installing the ESP32 board in the Arduino IDE here.

    Installing the AsyncTCP Library

    You also need to install the AsyncTCP Library. You can read more about Installing the AsyncTCP Library here.

    Codes

    You can get the code from my GitHub repository: guillengap-robot-arm

  • Step 9. Test and Conclusion

    Guillermo Perez Guillen, 06/28/2021 at 02:33

    TEST

    Before we start testing, we need to do the following:

    On your Raspberry Pi, run node assistant.js

    In your computer's terminal, run sam status and sam watch

    Congrats! Now you can test your robotic arm.

    CONCLUSION

    The prototype worked very well and is very useful for developing more complex actions with the robotic arm. However, I also had some difficulties, such as:

    • The board worked very well with 5 servos, but when I used 6 servos the system collapsed and the robotic arm made strange movements. Also, in the servomotor configuration I changed the value to min_pulse_ms: 0.2, and the robotic arm worked much better.
    • As I explained at the beginning of this tutorial, I had problems configuring the audio board, but this was successfully solved, and now the MATRIX Creator board is very efficient even when it receives keywords from half a meter away.

    It's easy to say that in the future this prototype can serve as a model in every home, and maybe we will have a robotic arm at our tables to help us with multiple tasks.

  • Step 8. Printing and Assembling the Panel

    Guillermo Perez Guillen, 06/28/2021 at 02:27

    I have designed a 10 x 10 cm panel for three functions:

    • The first is to hold the servo cables so that they don't get disconnected.
    • The second is to hold the connector of all physical grounds of the servos.
    • The third is to hold the connector of all Vcc connections of the servos.

    Below you can see a picture, and you can download the STL file from my GitHub repository here: https://github.com/guillengap/guillengap-robot-arm

  • Step 7. Wiring the Robot Arm

    Guillermo Perez Guillen, 06/28/2021 at 02:17

    Below you can see the electrical diagram with this project's connections. It's best to wire it patiently, so as not to make mistakes, because there are many cables to connect.

    I suggest you not wire servo 5: in the tests I carried out, the system collapsed when I connected all six servos, so I decided to disconnect the servomotor that was used the least.

  • Step 6. Creating assistant.js

    Guillermo Perez Guillen, 06/28/2021 at 02:15

    This step will show you how to set up a MATRIX Lite project with Snips.

    First, MQTT must be installed; it is used to listen for events from Snips.

    npm install mqtt --save

    assistant.js will be used to listen and respond to events from your Snips assistant. This file combines the code from everloop.js & relay.js to control the everloop LEDs and the robotic arm through voice.

    ///////////////////////////////////
    //AUTHOR: GUILLERMO PEREZ GUILLEN//
    ///////////////////////////////////
    
    /////////////
    //VARIABLES//
    /////////////
    var mqtt = require('mqtt');
    var client  = mqtt.connect('mqtt://localhost', { port: 1883 });
    var relay = require('./relay.js');
    var everloop = require('./everloop.js');
    var snipsUserName = 'guillengap';
    var wakeword = 'hermes/hotword/default/detected';
    var sessionEnd = 'hermes/dialogueManager/sessionEnded';
    var lightState = 'hermes/intent/'+snipsUserName+':roboticArmState';
    
    //////////////
    //ON CONNECT//
    //////////////
    client.on('connect', function() {
       console.log('Connected to Snips MQTT server\n');
       client.subscribe('hermes/hotword/default/detected');
       client.subscribe('hermes/dialogueManager/sessionEnded');
       client.subscribe(lightState);
    });
    //////////////
    //ON MESSAGE//
    //////////////
    client.on('message', function(topic,message) {
       var message = JSON.parse(message);
       switch(topic) {
           // * On Wakeword
           case wakeword:
               everloop.startWaiting();
               console.log('Wakeword Detected');
           break;
           // * Robotic Arm Change
           case lightState:
               // Turn robotic arm Here/There
               try{
                   if (message.slots[0].rawValue === 'here'){
                       relay.roboticarmHere();
                       everloop.stopWaiting();
                       console.log('Robotic Arm Here');
                   }
                   else if(message.slots[0].rawValue === 'there'){
                       relay.roboticarmThere();
                       everloop.stopWaiting();
                       console.log('Robotic Arm There');
                   }
               }
               // Expect error if `here` or `there` is not heard
               catch(e){
                   console.log('Did not receive a Here/There state')
               }
           break;
           // * On Conversation End
           case sessionEnd:
               everloop.stopWaiting();
               console.log('Session Ended\n');
           break;
       }
    });
    

     everloop.js incorporates the code required to control the LEDs on the MATRIX Creator.

    ///////////////////////////////////
    //AUTHOR: GUILLERMO PEREZ GUILLEN//
    ///////////////////////////////////
    
    /////////////
    //VARIABLES//
    /////////////
    const matrix = require('@matrix-io/matrix-lite');
    var matrix_device_leds = 0;// Holds amount of LEDs on MATRIX device
    var methods = {};// Declaration of method controls at the end
    var waitingToggle = false;
    var counter = 0;
    
    setInterval(function(){
       // Turns off all LEDs
       if (waitingToggle == false) {
               // Set individual LED value
               matrix.led.set({});
       }
       // Creates pulsing LED effect
       else if (waitingToggle == true) {
               // Set individual LED value
               matrix.led.set({
                   b: (Math.round((Math.sin(counter) + 1) * 100) + 10),// Math used to make pulsing effect
               });
           
       };
       counter = counter + 0.2;
       // Store the Everloop image in MATRIX configuration
    },50);
    ///////////////////
    //WAITING METHODS//
    ///////////////////
    methods.startWaiting = function() {
       waitingToggle = true;
    };
    methods.stopWaiting = function() {
       waitingToggle = false;
    };
    module.exports = methods;// Export methods in order to make them available to other files

    relay.js includes the code required to move the robotic arm here and there; it drives the arm's servos through the MATRIX Creator's GPIO PWM outputs.

    ///////////////////////////////////
    //AUTHOR: GUILLERMO PEREZ GUILLEN//
    ///////////////////////////////////
    
    /////////////
    //VARIABLES//
    /////////////
    const matrix = require('@matrix-io/matrix-lite');
    var methods = {};// Declaration of method controls at the end
    
    matrix.gpio.setFunction(0, 'PWM');
    matrix.gpio.setFunction(1, 'PWM');
    matrix.gpio.setFunction(2, 'PWM');
    matrix.gpio.setFunction(3, 'PWM');
    matrix.gpio.setFunction(4, 'PWM');
    matrix.gpio.setFunction(5, 'PWM');
    
    matrix.gpio.setMode(0,'output');
    matrix.gpio.setMode(1,'output');
    matrix.gpio.setMode(2,'output');
    matrix.gpio.setMode(3,'output');
    matrix.gpio.setMode(...
    Read more »

  • Step 5. Deploy your Assistant

    Guillermo Perez Guillen, 06/28/2021 at 02:07

    You can now deploy your assistant! On the bottom right of the page will be a Deploy Assistant button.

    Use the Sam CLI Tool to deploy the assistant to your Raspberry Pi. Below is an example of the command to use.
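
    If I recall the Sam CLI correctly, the deployment command looks like the line below (Sam must already be connected to your Raspberry Pi from Step 2); the Deploy Assistant dialog in the Snips console shows the exact command with your assistant's ID.

    sam install assistant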

  • Step 4. Adding an Intent

    Guillermo Perez Guillen, 06/28/2021 at 02:02

    Intents are what Snips uses to handle user requests. For this assistant, we'll start by creating an intent for sending the robotic arm Here or There from the MATRIX Creator's app.

    Create a new intent roboticArmState for your RoboticArm app

    Once you've defined your intent, you need to extract whether the user wants their robotic arm here or there. This is where slots come in.

    Create a new slot called power and a custom slot type with the same name.

    Make sure your power slot type has here & there as values.

    Create example sentences containing the words here and there. Highlight these words in your example sentences to assign them to your recently created power slot.

    Be sure to save!

  • Step 3. Creating a Snips Assistant and App

    Guillermo Perez Guillen, 06/28/2021 at 01:59

    Sign into your Snips.ai account.

    Create an assistant called Robotic Arm

    Create an App named RoboticArm

View all 12 project logs

  • 1
    Steps

    Follow the steps below to reproduce this project:

    • Step 1. Let's Get Started
    • Step 2. Connect Sam to Your Raspberry Pi
    • Step 3. Creating a Snips Assistant and App
    • Step 4. Adding an Intent
    • Step 5. Deploy your Assistant
    • Step 6. Creating assistant.js
    • Step 7. Wiring the Robotic Arm
    • Step 8. Printing and Assembling the Panel
    • Step 9. Test and Conclusion
    • Step 10. GUILLENGAP Robot Arm 2nd Version - HW & SW
    • Step 11. GUILLENGAP Robot Arm 2nd Version - Test
    • Step 12. GUILLENGAP Robot Arm 3rd Version - Sorting Objects by Color Using TensorFlow

    NOTE: All these steps are detailed in the Logs of this project

View all instructions


Discussions

Guillermo Perez Guillen wrote 07/19/2021 at 04:14

This project also participates in the "Supplyframe DesignLab: 2021 Hackaday Prize" contest, in the category of "Challenge 2: Refresh Work-From-Home Life".


Guillermo Perez Guillen wrote 06/28/2021 at 03:00

This project participates in the "Supplyframe DesignLab: 2021 Hackaday Prize" contest, in the category of "Challenge 4: Redefine Robots" and the updates will be published before the deadline.

