DIY hobby servos quadruped robot

A cheap 3D-printed four-legged robot that almost looks like Boston Dynamics' Spot but moves like a newborn.

Being an undergraduate in physics and bored of solving paper problems, I decided to tackle a real-world problem, so I started this robotics project to introduce myself to control theory, study its maths and practice programming.
It runs with a Raspberry Pi 4 as the brain, plus an Arduino Mega for reading/writing the signals of 12 hobby servos and an IMU, all controlled with a PS3 DualShock.

General Summary

The code, mostly written by me in Python 3 along with some typical libraries for the task, currently implements a walking loop on top of a kinematic model: you can command a pose to the body and positions to the feet, as well as velocity and direction commands for the walking loop, all from a PS3 controller over Bluetooth.

The Arduino is used to speed up the reading and writing of signals, so in the future I can add other sensors as well as position feedback...

Let's see what this under-$500 robot can do:

Code of the robot


Project log index (updated):

  1. Model and code explanation.
    1. The kinematic model.
    2. Step trajectory and Gait planner.
    3. Arduino code  and serial communication.
    4. Raspberry Pi communication interfaces.
    5. Calibrating the servos.
    6. PyBullet Simulation example.
    7. Future 3DoF foot sensor.
  2. Robot experiments.
    1. A simple walking method.
    2. Real robot vs PyBullet simulation.
    3. Hitting the robot.
    4. Quadruped robot meets my cat.
    5. A more stable walking.


Why build a quadruped robot?

Apart from the interesting control problems this robot involves, there are plenty of interesting tasks it can carry out, and they are beginning to be demonstrated, as we have seen with the Spot robot during the 2020 pandemic:

It's obvious that there is still a lot of work to do, but we are at the point where we can build one of these at home and get a very good approximation of it.

There is a key factor in this robot: it doesn't need roads in order to operate, so it can reach places that are very hard for a wheeled machine. That makes these four-legged robots ideal for tasks such as surveillance, reconnaissance of areas not passable by vehicles, and even rescues in areas at risk of collapse.

Finally, it is clear that for now my robot isn't able to do those tasks, but I am satisfied that it walks stably, and who knows, maybe in the future it will be able to bring me a beer from the fridge just by telling it: 'dog, bring me a beer!'

What problems will you face with this robot?

These legged robots have always amazed me, but it was hard to find easy-to-understand research, on top of non-trivial mechanical designs and complex maths. That makes it hard to achieve good results with DIY resources. So I rolled up my sleeves and started reading papers from different research groups working on the subject.

What does my project bring to the maker community?

As this project is multidisciplinary, apart from designing my own version of the robot, I focused on its maths, solving different problems related to the robot's movement and building a very straightforward model that implements only the maths necessary to run the robot, showing how every equation relates to the real robot and giving a general view of it.

This is not the most robust way of building a model, as real-world robots use a lot of additional mathematical methods to polish the robot's behaviour. For example, adding interpolation methods would make the robot perform more smoothly, and building a state estimator plus filter (e.g. a Kalman state estimator) would let you make very precise measurements on the robot and make its movements even more natural.


Visual scheme of electronics and model.

As you can see in the scheme, in this setup I use one buck converter per 3 servos, 4 buck converters in total. This is because each servo drains up to 4 A peak.

NOTE: This is not a rigorous wiring diagram, it is just a drawing.

Licensing

This project uses the GPL v3 license for all software related.

This project uses the Attribution-NonCommercial-ShareAlike 4.0 International for all hardware components.


Robot Update (AUG 2020)

Great news for the project: it has been one of the finalists in the Wildcard challenge of the HACKADAY PRIZE 2020.

So this month will come full of new updates, along with a very nice video for my contest submission.

Robot Update (JUL 2020)

  • Finally...


  • 12 × Servo DS3218 PRO version. 20 kg/cm torque and 0.1 s/60º.
  • 12 × Metal horns for the servos. Plastic ones are not an option, as they develop play.
  • 3 × 2x 18650 battery holders in series (7.4 V)
  • 20 × 8x3x4 mm bearings (outer x inner x width)
  • 1 × MPU6050 IMU, accelerometer + gyroscope.


  • Bill of materials and price of the robot.

    Miguel Ayuso Parrilla 4 days ago 0 comments

    To complete the Components section, in this log I want to show the complete bill of materials, with the price of one unit of the robot. I have to say I'm surprised how cheap it is to build just one robot.

    In my case, this development has been more expensive, as I have made 3 different prototypes with their respective components, so in total I have spent roughly twice the price of one unit.

    Almost all components were bought from the typical Chinese online shops.

  • PyBullet Simulation example

    Miguel Ayuso Parrilla 09/15/2020 at 13:06 0 comments

    In this log I'm going to talk about the simulation used for the robot's development; it was mostly used for debugging the maths inside the different models.

    The simulation code is no longer part of the Raspberry code, as the Pi can't support GUI mode and I'm not taking data from the PyBullet environment. The code I am referring to is in the V1_PCsimulation branch of the robot's GitHub repository.


    This code can be run with the robot by just plugging the Arduino into the PC via USB and running the robot_walking.py file; all requirements are in README.md.

    But if you don't have the robot built, you can run just the simulation and play along with the robot inside the PyBullet environment. Also feel free to modify and improve it.

    You just need PyBullet and NumPy on Python 3, then download the V1_PCsimulation branch and run walking_simulation_example.py.

    More info on how the simulation works is shown in the next video:


    Now, I'm going to briefly explain the main code of the simulation, shown here:

    """
    Created on Sun Mar 15 19:51:40 2020
    @author: linux-asd
    """
    
    import pybullet as p
    import numpy as np
    import time
    import pybullet_data
    from pybullet_debuger import pybulletDebug  
    from kinematic_model import robotKinematics
    from gaitPlanner import trotGait
    
    """Connect to the physics client, you can choose GUI or DIRECT mode"""
    physicsClient = p.connect(p.GUI)#or p.DIRECT for non-graphical version
    p.setAdditionalSearchPath(pybullet_data.getDataPath()) #optionally
    """set Gravity"""
    p.setGravity(0,0,-9.8)
    
    """import the robot's URDF file, if it is fixed no ground plane is needed""" 
    cubeStartPos = [0,0,0.2]
    FixedBase = False #if fixed no plane is imported
    if (FixedBase == False):
        p.loadURDF("plane.urdf")
    boxId = p.loadURDF("4leggedRobot.urdf",cubeStartPos, useFixedBase=FixedBase)
    
    jointIds = []
    paramIds = [] 
    time.sleep(0.5)
    """get info along all the joints for debugging if needed"""
    for j in range(p.getNumJoints(boxId)):
    #    p.changeDynamics(boxId, j, linearDamping=0, angularDamping=0)
        info = p.getJointInfo(boxId, j)
        print(info)
        jointName = info[1]
        jointType = info[2]
        jointIds.append(j)
        
    footFR_index = 3
    footFL_index = 7
    footBR_index = 11
    footBL_index = 15   
    
    """init the classes used"""
    pybulletDebug = pybulletDebug()
    robotKinematics = robotKinematics()
    trot = trotGait() 
    
    #robot properties
    """initial foot position"""
    #foot separation (Ydist = 0.16 -> tetta=0) and distance to floor
    Xdist = 0.20
    Ydist = 0.15
    height = 0.15
    """This matrix will define where the feet are going to be at its standing position
    with respect the center of the robot"""
    #body frame to foot frame vector
    bodytoFeet0 = np.matrix([[ Xdist/2 , -Ydist/2 , -height],
                             [ Xdist/2 ,  Ydist/2 , -height],
                             [-Xdist/2 , -Ydist/2 , -height],
                             [-Xdist/2 ,  Ydist/2 , -height]])
    """Gait definition parameters, here you can set different period of step and
    the offset between every foot's loop, defining different gaits"""
    T = 0.5 #period of time (in seconds) of every step
    offset = np.array([0.5 , 0. , 0. , 0.5]) #defines the offset between each foot step in this order (FR,FL,BR,BL)
    
    """start a real time simulation, so no simulation steps are needed"""
    p.setRealTimeSimulation(1)
        
    while(True):
        lastTime = time.time()
        """show sliders in the GUI"""
        pos , orn , L , angle , Lrot , T = pybulletDebug.cam_and_robotstates(boxId)
        #calculates the feet coord for gait, defining length of the step and direction (0º -> forward; 180º -> backward)
        bodytoFeet = trot.loop(L , angle , Lrot , T , offset , bodytoFeet0)

        #####################################################################################
        #####   kinematics Model: Input body orientation, deviation and foot position    ####
        #####   and get the angles, necessary to reach that position, for every joint    ####
        FR_angles, FL_angles, BR_angles, BL_angles , transformedBodytoFeet = robotKinematics.solve(orn , pos , bodytoFeet)

        """move all joints at the next time step angle"""
        #move movable joints
        for i in range(0, footFR_index):
            p.setJointMotorControl2(boxId,...

  • Future 3DoF foot sensor.

    Miguel Ayuso Parrilla 08/27/2020 at 20:52 0 comments

    In order to make the robot dynamic in the precise meaning of the word, we need measurements of the forces acting inside the robot; these can come either from the actuators, the feet or the IMU.

    This way we can build a robust control algorithm that takes these forces into account, such as the inverted pendulum model. Basically, what we want here is to implement a physics-based model instead of a direct model like the kinematic model it implements now.

    This task is not easy to achieve, but I found a paper with a very intelligent solution to this problem: Foot design for a hexapod walking robot.

    Basically, it uses 3 force-sensitive resistors, oriented properly in order to read the forces at the foot in 3 different directions.

    What I came up with is shown in these pictures, but this is only a quick design with lots of errors when it comes to reading accurate measurements.
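    To make the idea concrete, here is a minimal sketch of how three single-axis readings could be combined into a 3D contact force. The mounting-axis matrix below is hypothetical (not the geometry from the paper, nor from my design); the assumption is simply that each FSR reads the component of the foot force along its own axis, so the three axes form a linear system that can be inverted.

```python
import numpy as np

# Hypothetical mounting axes of the 3 FSRs (unit vectors, foot frame).
# The real geometry depends on how the sensors are oriented in the foot.
S = np.array([[1.0,   0.0,   0.0  ],
              [0.0,   1.0,   0.0  ],
              [0.577, 0.577, 0.577]])

def foot_force(readings):
    """Recover the 3D contact force F from the 3 readings r, where r = S @ F."""
    return np.linalg.solve(S, np.asarray(readings, dtype=float))
```

    With real hardware, each reading would first need the FSR's resistance-to-force calibration curve applied before solving.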


    Some recent work on the MIT Cheetah shows how useful this type of foot sensor can be:

  • Calibrating the servos.

    Miguel Ayuso Parrilla 08/01/2020 at 09:15 0 comments

    First of all, you will need to plug each servo into its proper pin; in my case here they are (you can change this for your own setup):

    void connectServos() {
                          //FR
      Servos[0].attach(40); //coxa
      Servos[1].attach(38); //femur
      Servos[2].attach(36); //tibia
                          //FL
      Servos[3].attach(42); //coxa
      Servos[4].attach(44); //femur
      Servos[5].attach(46); //tibia
                         //BR
      Servos[6].attach(34); //coxa
      Servos[7].attach(32); //femur
      Servos[8].attach(30); //tibia
                         //BL
      Servos[9].attach(48); //coxa
      Servos[10].attach(50); //femur
      Servos[11].attach(52); //tibia
    }

    As soon as the servos are correctly attached, what I did in order to put the servos in their correct position was to disassemble the legs, i.e. separate the servo output from its horn, letting the servo move freely. This way you can power the servos, let them go to the zero position and then reassemble.

    I wrote a small sketch to put the servos in their zero position; you can find it in the servo_calibration.py file. This way you can attach each servo very close to its zero position.

    Then a more precise calibration is needed: you will need to calibrate the offset of each servo. Those offsets are defined in the angleToPulse.py file, which only contains the lines defining the pulse corresponding to each angle for every servo.

    Each is a line of the form y = Ax + B. The offset of each servo is given by B, i.e. the pulse at zero angle.

    If you are using different servos, the slope of the line (A) will be different, so you will need to identify it. What I did was take 4-5 measurements of the pulse at different angles and put them in a spreadsheet where I computed the fitted line from those points.

    def convert(FR_angles, FL_angles, BR_angles, BL_angles):
        pulse = np.empty([12])
        #FR
        pulse[0] = int(-10.822 * np.rad2deg(-FR_angles[0])) + 950
        pulse[1] = int(-10.822 * np.rad2deg(FR_angles[1])) + 2280
        pulse[2] = int(10.822 * (np.rad2deg(FR_angles[2]) + 90)) + 1000
        #FL
        pulse[3] = int(10.822 * np.rad2deg(FL_angles[0])) + 1020 
        pulse[4] = int(10.822 * np.rad2deg(FL_angles[1])) + 570
        pulse[5] = int(-10.822 * (np.rad2deg(FL_angles[2]) + 90)) + 1150
        #BR
        pulse[6] = int(10.822 * np.rad2deg(-BR_angles[0])) + 1060 
        pulse[7] = int(-10.822 * np.rad2deg(BR_angles[1])) + 2335 
        pulse[8] = int(10.822 * (np.rad2deg(BR_angles[2]) + 90)) + 1200
        #BL
        pulse[9] = int(-10.822 * np.rad2deg(BL_angles[0])) + 890
        pulse[10] = int(10.822 * np.rad2deg(BL_angles[1])) + 710
        pulse[11] = int(-10.822 * (np.rad2deg(BL_angles[2]) + 90)) + 1050
        return pulse
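    The spreadsheet fit can also be done directly in Python with a least-squares line. The measurements below are made up for illustration (they just happen to lie on a 10.822 µs/degree line, like the slope used in convert() above):

```python
import numpy as np

# Hypothetical (angle, pulse) measurements for one servo -- replace with your own.
angles = np.array([-45.0, 0.0, 45.0])        # degrees
pulses = np.array([1013.0, 1500.0, 1987.0])  # microseconds

# Fit pulse = A*angle + B; B is the zero-angle offset described above.
A, B = np.polyfit(angles, pulses, 1)
```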

  • A more stable walking.

    Miguel Ayuso Parrilla 07/20/2020 at 11:49 0 comments

    This log is about polishing the gait planner, as you can see in the next video:


    What is interesting in this video is that no control algorithm is running, yet it looks very stable.

    This is achieved just by separating the stance phase from the swing phase (as explained in the gaitPlanner.py log). This way I can define an offset between both phases, so running the stance phase over 3/4 of the whole step makes the robot hold its weight on all 4 legs for a short period of time, making it a bit more stable, just by running the gait planner loop.
    Visually, the gait is defined as follows:

    An additional note: the servos get very hot if the robot runs for more than 10 minutes.
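    The phase bookkeeping behind this can be sketched as follows. This is a simplified illustration, not the actual gaitPlanner.py code: each leg runs the same cycle shifted by its entry in the offset array, and with the stance taking 3/4 of the cycle, a leg only swings during the last quarter of its own phase.

```python
import numpy as np

offset = np.array([0.5, 0.0, 0.0, 0.5])  #cycle offsets (FR, FL, BR, BL), as in the main code

def leg_phases(t, T=0.5):
    """Phase in [0, 1) of each leg's step cycle at time t (T = step period)."""
    return (t / T + offset) % 1.0

def swinging(t, T=0.5, stance=0.75):
    """True for legs in swing. With these offsets the two swing windows never
    overlap, so the robot stands on all four feet half of the time."""
    return leg_phases(t, T) > stance
```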

  • Quadruped robot meets my cat.

    Miguel Ayuso Parrilla 07/07/2020 at 18:29 0 comments

    As a quick update on the V2 robot's performance, I have introduced it to my cat.

    My cat is very introverted and usually doesn't like new things; this case is no different, but the robot does seem to spark some curiosity in her.

    The robot performs well, as it runs the latest version of the code. The extra weight of the batteries and Raspberry Pi is not a problem, as the new design is much lighter and the servos are now powered at 6 V instead of 5 V as in the first version.

    On the other hand, I have noticed the model randomly gives bad solutions when running on the Raspberry Pi, which is why you sometimes see a leg make a strange movement. I haven't identified the problem yet; maybe it's a software problem on my side or a buffer overflow.

    There is no intention to hurt the animal, it's my cat!

  • Real robot vs PyBullet simulation.

    Miguel Ayuso Parrilla 06/26/2020 at 14:45 0 comments

    In this video I compare the real robot with the simulation model:

    I'm using the PyBullet physics engine; it is simple to use and seems to perform well, with very precise results. In the video you can see that the simulation is a bit slow; this is because the simulation (at the time of the video) was not running in real time but computing a step every 20 ms.

  • Raspberry Pi communication interfaces.

    Miguel Ayuso Parrilla 06/21/2020 at 21:41 0 comments

    This log is about the Python communication files, explaining the interfaces used to communicate both with the PS3 controller and with the Arduino.

    Bidirectional serial communication in Python.

    This is written in the serial_com.py file and the code goes as follows:

    • First, define the serial port and its baudrate.
    import serial
    import numpy
    import time
    
    class ArduinoSerial: 
        def __init__(self , port):
            #ls -l /dev | grep ACM to identify serial port of the arduino
            self.arduino = serial.Serial(port, 500000, timeout = 1)
            self.arduino.setDTR(False) #force an arduino reset
            time.sleep(1)
            self.arduino.flushInput()
            self.arduino.setDTR(True)
    • Then the communication is done with the next two functions. These functions are very general and can be used without big changes.

      This program blocks the main loop until the next data arrives; this way the Arduino's memory is not overloaded by the incoming pulse commands.

        def serialSend(self, pulse):
            comando = "<{0}#{1}#{2}#{3}#{4}#{5}#{6}#{7}#{8}#{9}#{10}#{11}>" #Input
            command = comando.format(int(pulse[0]), int(pulse[1]), int(pulse[2]), 
                                     int(pulse[3]), int(pulse[4]), int(pulse[5]), 
                                     int(pulse[6]), int(pulse[7]), int(pulse[8]), 
                                     int(pulse[9]), int(pulse[10]), int(pulse[11]))
            self.arduino.write(bytes(command , encoding='utf8'))
    
        def serialRecive(self):
            try:
                startMarker = 60 # ord('<')
                endMarker = 62   # ord('>')
                getSerialValue = bytes()
                x = "z" # any value that is not an end- or startMarker
                # wait for the start character
                while ord(x) != startMarker:
                    x = self.arduino.read()
                # save data until the end marker is found
                while ord(x) != endMarker:
                    if ord(x) != startMarker:
                        getSerialValue = getSerialValue + x
                    x = self.arduino.read()
                loopTime , Xacc , Yacc , roll , pitch = numpy.fromstring(getSerialValue.decode('ascii', errors='replace'), sep = '#' )
                return loopTime , Xacc , Yacc , roll , pitch
            except TypeError: # a read timed out and returned empty bytes
                return None
    


    Connecting the PS3 controller to the Raspberry Pi and reading its events.

    • Events are read in the joystick.py file using the evdev Python library. In this setup, events are read every time the read() function is called, ignoring the other ongoing events; there is no queue or filtering.
    from evdev import InputDevice, categorize, ecodes
    from select import select
    import numpy as np
    
    class Joystick:
        def __init__(self , event):
            #python3 /usr/local/lib/python3.8/dist-packages/evdev/evtest.py to identify the event
            self.gamepad = InputDevice(event)
            self.T = 0.5              #step period, adjusted with the up/down arrows
            self.compliantMode = False
            self.L3 = np.zeros(2)     #left stick (x, y)
            self.R3 = np.zeros(2)     #right stick (x, y)
    
        def read(self):
            r,w,x = select([self.gamepad.fd], [], [], 0.)
            if r:
                for event in self.gamepad.read():
                    if event.type == ecodes.EV_KEY:
                        if event.value == 1:
                            if event.code == 544:#up arrow
                                self.T += 0.05
                            if event.code == 545:#down arrow
                                self.T -= 0.05
                            if event.code == 308:#square
                                if self.compliantMode == True:
                                    self.compliantMode = False
                                elif self.compliantMode == False:
                                    self.compliantMode = True    
                        else:
                            print("button released")
    
                    elif event.type == ecodes.EV_ABS:
                        absevent = categorize(event)
                        if ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_X":  
                            self.L3[0]=absevent.event.value-127
                        elif ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_Y":
                            self.L3[1]=absevent.event.value-127
                        elif ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_RX":
                            self.R3[0]=absevent.event.value-127
                        elif ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_RY":
                            self.R3[1]=absevent.event.value-127
    

  • A simple walking method.

    Miguel Ayuso Parrilla 06/15/2020 at 00:39 0 comments

    This experiment was the first one done on the robot; it consists of a very simple Arduino sketch implementing the IK equations and a simple creep-gait loop with 4-point Bezier curves.

    It only walks forward indefinitely, but it is a useful example of the IK equations written in Arduino language; here is the Arduino-quadruped-robot-loop repository on my GitHub.

  • Role of Arduino and Serial Communication.

    Miguel Ayuso Parrilla 06/12/2020 at 21:34 1 comment

    Working frequencies.

    The Arduino is used as a PWM signal generator for the servos, and since software PWM doesn't really produce a clean PWM signal, the main loop must run at the duty frequency of the servos (in my case 50-300 Hz, i.e. a 20-4 ms loop). Of course, the Arduino is more powerful than that, so after sending one pulse and before the next one is sent, it reads the different sensors, sends their data and receives the next pulse command from the Raspberry Pi (via bidirectional serial commands).

    As the Arduino loop is limited by the servos' frequency, the main Python code also works at that frequency. The reason is that if the model produces a solution every 5 ms but the actuators can only accept one every 20 ms, 'useless' solutions are computed and the system does not behave as desired.
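    A minimal sketch of this rate matching on the Python side. This is an illustration, not the actual code, which paces itself by blocking on the Arduino's serial reply instead of sleeping; the 20 ms figure assumes a 50 Hz servo refresh:

```python
import time

T = 0.02  # 20 ms slot -> 50 Hz, matching an assumed servo refresh rate

def control_loop(iterations):
    last = time.perf_counter()
    for _ in range(iterations):
        # ... solve the kinematics and send the 12 pulses here ...
        # then sleep off whatever is left of the 20 ms slot, so the model
        # never produces solutions faster than the servos can consume
        elapsed = time.perf_counter() - last
        if elapsed < T:
            time.sleep(T - elapsed)
        last = time.perf_counter()
```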

    Arduino sketch.

    The Arduino code is very simple, but a bit long. Basically, it is composed of two files: the main program, arduino_com.ino, and the IMU functions, gy_521_send_serial.ino (code by "Krodal"), which includes a low-pass filter for the orientation. It works very well, without any FIFO problems, but there is a lot of drift on the yaw angle, which is normal for these gyro+accelerometer sensors.
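    The standard trick for this class of sensor is a complementary filter, which blends the integrated gyro rate (smooth but drifting) with the accelerometer angle (noisy but drift-free). A one-line Python sketch of the idea, not Krodal's actual code:

```python
def complementary_filter(angle, gyro_rate, acc_angle, dt, alpha=0.98):
    """New angle estimate: mostly the integrated gyro, gently pulled toward
    the accelerometer angle. Yaw drifts because the accelerometer has no
    gravity component with which to correct it."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * acc_angle
```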

    About the main code:

    void loop() {
      unsigned long currentMillis = millis();
      if (currentMillis - previousMillis >= interval) {
        previousMillis = currentMillis;
    
        readAngles();
        recvWithStartEndMarkers();
        newData = false;
        moveServos(pulse0, pulse1, pulse2, pulse3, pulse4, pulse5, pulse6, pulse7, pulse8, pulse9, pulse10, pulse11);
      }
    }

    This is basically what I explained: it only runs when the interval time has passed, and then it reads the angles from the IMU, reads the incoming pulses via serial, switches the newData flag off and writes the new pulse commands.

    The pulses are given in microseconds; the translation from angles to microseconds is done in the Python code.

    recvWithStartEndMarkers().

    How commands are read and then converted to integers is handled by this function, which comes from: Serial Input Basics - updated.

    Here is the example I used:

    // Example 3 - Receive with start- and end-markers
    
    const byte numChars = 32;
    char receivedChars[numChars];
    
    boolean newData = false;
    
    void setup() {
        Serial.begin(9600);
        Serial.println("<Arduino is ready>");
    }
    
    void loop() {
        recvWithStartEndMarkers();
        showNewData();
    }
    
    void recvWithStartEndMarkers() {
        static boolean recvInProgress = false;
        static byte ndx = 0;
        char startMarker = '<';
        char endMarker = '>';
        char rc;
     
        while (Serial.available() > 0 && newData == false) {
            rc = Serial.read();
    
            if (recvInProgress == true) {
                if (rc != endMarker) {
                    receivedChars[ndx] = rc;
                    ndx++;
                    if (ndx >= numChars) {
                        ndx = numChars - 1;
                    }
                }
                else {
                    receivedChars[ndx] = '\0'; // terminate the string
                    recvInProgress = false;
                    ndx = 0;
                    newData = true;
                }
            }
    
            else if (rc == startMarker) {
                recvInProgress = true;
            }
        }
    }

    What I have done is add a separator ('#') to differentiate each pulse, plus a counter in charge of saving each numeric string to its corresponding integer pulse.
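    For reference, the same framing and splitting in Python (only an illustration; the Arduino version works on the receivedChars char array with the counter described above):

```python
def parse_frame(frame):
    """Split a '<p0#p1#...#p11>' frame into 12 integer pulses."""
    assert frame.startswith('<') and frame.endswith('>'), "incomplete frame"
    pulses = [int(tok) for tok in frame[1:-1].split('#')]
    assert len(pulses) == 12, "expected one pulse per servo"
    return pulses
```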


  • 1
    Building premises.

    I'm going to enumerate some important concepts I have taken into account for the robot's design.

    • This robot depends heavily on its mass distribution and on the dynamics of the moving parts (the legs). That's why I chose this servo configuration, with the coxa servo on the body, the femur servo on the coxa and the tibia servo on the femur, so only 2 servos move with each leg.
    • This way, the lower the leg's weight, the less unexpected inertia, giving a better dynamic behavior.
    • Also, the robot's weight should be as low as possible, because hobby servos are easily overloaded.
    • Regarding mass distribution: it defines the Center of Mass, which should be as close as possible to the geometric center. For this, the build must be as symmetric as possible; that way the CoM is easier to locate and the control is more predictable.
    • The length of the leg is set at 100 mm for both tibia and femur; this way, a motor with 20 kg/cm of torque will hold 2 kg at a 10 cm arm. The longer the arm, the less force the motor can support.
    • Avoid plastic-on-plastic joints, as they involve unnecessary friction forces.
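    The torque premise above is just the lever rule; making it explicit (numbers from the leg-length bullet):

```python
def max_static_load_kg(torque_kg_cm, arm_cm):
    """Load a servo can hold statically at the end of a lever arm."""
    return torque_kg_cm / arm_cm

# a 20 kg/cm servo at a 10 cm arm (one 100 mm leg segment)
print(max_static_load_kg(20.0, 10.0))  # -> 2.0
```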

    Printing premises.

    • This is simple but important to take into account at printing time: these parts do not support large mechanical forces, so following the building premises, the 3D-printed parts should be printed with ~10-20% infill, 1-2 outer perimeters and 2-3 solid layers.
  • 2
    Building and printing the robot.

    For the build I have made this tutorial, as there are some tricky parts.

    In the files section, each STL is named with the quantity of parts you need to print; if no number is given, just one needs to be printed.

    You can also download all the files at once from the files section, in STL_files.zip.

    Bolt list:

    • M3 x45 mm: quantity 12
    • M3 x30 mm: quantity 26
    • M3 x25 mm: quantity 16
    • M3 x20 mm: quantity 8
    • M3 x15 mm: quantity 16
    • M3 x7 mm: quantity 4
    • M3 threaded rod x 42 mm: quantity 4
    • M3 threaded rod x 74 mm: quantity 8

    Plus their respective nuts and washers where needed.

  • 3
    Electronics.

    Here is the electronics setup I have:

    You can download the full-size PDF of the scheme in the files section.

    As you can see, apart from the 4 power lines that feed the servomotors, there is also a voltage divider so the Arduino can read the battery voltage on an analog pin, with 5 LEDs indicating the battery status. The IMU is connected via the I2C interface. The Raspberry Pi is powered by another buck converter at 5 V 3 A, which is missing from the scheme (I will correct this in the next update), and the Arduino communicates with the Raspberry Pi via a USB cable (which also powers it).
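    For the battery-voltage divider, any ratio that keeps the pack's maximum voltage below the Arduino's 5 V analog range works. A sketch with hypothetical resistor values (not necessarily the ones on my board):

```python
def divider_vout(v_battery, r_top, r_bottom):
    """Voltage seen by the analog pin for a divider r_top over r_bottom."""
    return v_battery * r_bottom / (r_top + r_bottom)

# a 2S li-ion pack peaks at ~8.4 V; a 1:1 divider keeps the pin below 5 V
print(divider_vout(8.4, 10000, 10000))  # -> 4.2
```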

    I made two small modular boards to easily connect and disconnect the servos. These boards are connected to the big one, which supplies all the pins to the Arduino: the 12 PWM signals with their common ground, plus the I2C interface for the IMU. I can also add sensors (FSRs) in the same way in the future.

    Details of the building:

    • Starting with the small boards that supply the servo power and the digital signals to the Arduino: they share GND and are divided into 4 different lines, one per buck converter. They connect to the main board, which accommodates the IMU and two of these boards.
    • The other part that holds some electronics is the back cover of the robot, where the emergency button is placed along with the 5-LED battery indicator; the voltage divider for reading the battery voltage is also placed here, attached with some hot glue.
    • Then everything is ready to be mounted. Notice that the buck converters sit between the femur servos, attached with tape. You can also see how the Raspberry Pi and Arduino Mega are attached to the top of the robot.
    • Finally, the battery is made of individual cells, so I have to charge them one by one.



Discussions

Alior101 wrote 05/13/2020 at 16:34 point

Cool! Is this the 180 or 270 pro version of  DS3218?


Miguel Ayuso Parrilla wrote 05/13/2020 at 17:17 point

The 180 PRO version; the tibia movement, for example, is limited to 110 degrees.


Alex Tacescu wrote 05/13/2020 at 15:58 point

Super cool project! Are you calculating forward/inverse kinematics on the Arduino? Or is everything done in the physics engine?


Miguel Ayuso Parrilla wrote 05/13/2020 at 17:21 point

I'm calculating them in the Python code on the computer/Raspberry Pi. For now I'm not using data from the simulation; I only send the angles for every joint to the simulation, the same as to the real robot.

