
DIY hobby servos quadruped robot

A cheap 3D-printed four-legged robot that almost looks like Boston Dynamics' Spot but moves like a newborn.

Being an undergraduate in physics and bored of solving problems on paper, I decided to tackle a real-world one, so I started this robotics project to introduce myself to control theory, study its maths and practice programming.
It runs with a Raspberry Pi 4 as the brain, plus an Arduino Mega that reads and writes the signals of the 12 hobby servos and the IMU, and it is controlled with a PS3 DualShock.

General Summary

The code, mostly written by me in Python 3 along with some typical libraries for the task, currently implements a walking loop on top of a kinematic model: you can command a body pose and feet positions, as well as velocity and direction commands for the walking loop, all from a PS3 controller over Bluetooth.

The Arduino is used to speed up the reading and writing of signals, so in the future I can add other sensors, as well as position feedback...
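To give an idea of how these pieces fit together, here is a rough sketch of the Raspberry Pi side of the loop. The robotKinematics and trotGait classes and their calls follow the repository's simulation example; read_ps3_commands() and send_to_arduino() are simplified placeholders for the joystick and serial interfaces, not the actual code.

    import time
    import numpy as np
    from kinematic_model import robotKinematics
    from gaitPlanner import trotGait

    kinematics = robotKinematics()
    trot = trotGait()

    #standing feet positions with respect to the body center, one row per leg (FR, FL, BR, BL)
    bodytoFeet0 = np.matrix([[ 0.10 , -0.075 , -0.15],
                             [ 0.10 ,  0.075 , -0.15],
                             [-0.10 , -0.075 , -0.15],
                             [-0.10 ,  0.075 , -0.15]])
    offset = np.array([0.5 , 0. , 0. , 0.5])  #phase offset between the feet

    def read_ps3_commands():
        #placeholder: in the real robot these values come from the DualShock over Bluetooth
        pos, orn = np.zeros(3), np.zeros(3)     #body position and orientation
        L, angle, Lrot, T = 0.0, 0.0, 0.0, 0.5  #step length, direction, rotation, period
        return pos, orn, L, angle, Lrot, T

    def send_to_arduino(*leg_angles):
        #placeholder: the real code converts the 12 angles to PWM pulses and sends them over serial
        pass

    while True:
        pos, orn, L, angle, Lrot, T = read_ps3_commands()
        #gait planner: where every foot should be at this instant of the step cycle
        bodytoFeet = trot.loop(L, angle, Lrot, T, offset, bodytoFeet0)
        #inverse kinematics: joint angles needed to reach those feet positions
        FR, FL, BR, BL, _ = kinematics.solve(orn, pos, bodytoFeet)
        send_to_arduino(FR, FL, BR, BL)
        time.sleep(0.01)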

Let's see what this under-$500 robot can do:


Project log index (updated):

  1. Model and code explanation.
    1. The kinematic model.
    2. Step trajectory and Gait planner.
    3. Arduino code and serial communication.
    4. Raspberry Pi communication interfaces.
    5. Calibrating the servos.
    6. PyBullet Simulation example.
    7. Future 3DoF foot sensor.
    8. Improving the leg mechanism and PCB board.
  2. Robot experiments.
    1. A simple walking method.
    2. Real robot vs PyBullet simulation.
    3. Hitting the robot.
    4. Quadruped robot meets my cat.
    5. A more stable walking.

💻💻💻💻 Code of the robot.


📚📚📚📚 Bibliography and readings of the project.


🔨🔨🔨🔨Bill of materials and price.


🏆🏆🏆🏆Makers that built the robot.


Why build a quadruped robot?

Apart from the interesting control problems this robot involves, there are lots of interesting tasks it can carry out, and they are beginning to be demonstrated, as we have seen with the Spot robot during the 2020 pandemic:

It's obvious that there is still a lot of work to do, but we have reached the point where we can build one of these at home and get a very good approximation of it.

There is a key advantage to this kind of robot: it doesn't need roads to operate, so it can reach places that are very hard for a wheeled machine. That makes four-legged robots ideal for tasks such as surveillance, reconnaissance of areas not passable by vehicles, and even rescue in areas at risk of collapse.

Finally, it is clear that for now my robot isn't able to do those tasks, but I am satisfied that it walks stably, and who knows, maybe in the future it will be able to bring me a beer from the fridge just by telling it: 'dog, bring me a beer!'

What problems will you face with this robot?

These legged robots have always amazed me, but it was hard to find easy-to-understand research; the mechanical designs are non-trivial and the maths are complex, which makes it hard to achieve good results with DIY resources. So I rolled up my sleeves and started reading papers from different research groups working on the subject.

What does my project bring to the maker community?

As this project is multidisciplinary, apart from designing my own version of the robot I focused on its maths, solving the different problems related to the movement of the robot and building a very straightforward model which only implements the maths necessary to run it, showing how every equation relates to the real robot and giving a general view of it.

This is not the most robust way of building a model: real-world robots use a lot of additional mathematical methods to polish their behaviour. For example, adding interpolation methods would make the robot move more smoothly, and building a state estimator with a filter (e.g. a Kalman filter) would let you make very precise measurements on the robot and make its movements even more natural.
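Just to illustrate the kind of filtering I mean (this is not something the current code does), a minimal one-dimensional Kalman filter that smooths a noisy reading, for example a body-pitch value from the IMU, could look like the sketch below; the noise values are arbitrary placeholders.

    #minimal 1-D Kalman filter sketch: smooth a noisy scalar reading (e.g. body pitch from the IMU)
    #the process/measurement noise values are illustrative placeholders, not tuned for this robot
    class Kalman1D:
        def __init__(self, q=1e-4, r=1e-2):
            self.q = q    #process noise variance
            self.r = r    #measurement noise variance
            self.x = 0.0  #state estimate
            self.p = 1.0  #estimate variance

        def update(self, z):
            self.p += self.q                 #predict: the uncertainty grows
            k = self.p / (self.p + self.r)   #Kalman gain
            self.x += k * (z - self.x)       #correct with the new measurement
            self.p *= (1.0 - k)
            return self.x

    #usage: feed raw IMU samples and use the filtered value in the model
    #kf = Kalman1D()
    #smooth_pitch = kf.update(raw_pitch)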


Visual scheme of electronics and model.

As you can see in the scheme, in this setup I use one buck converter per 3 servos, with 4 buck converters in total. This is because each servo can drain up to 4 A at peak, so each converter has to supply roughly 12 A peak for its group of three.

NOTE: This is not a rigorous wiring diagram, it is just a drawing.

Licensing

This project uses the GPL v3 license for all the software.

This project uses the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license for all hardware components.



View all 28 files

  • 12 × Servo DS3218 PRO version. 20 kg·cm torque and 0.1 s/60°.
  • 12 × Metal servo horns. Plastic ones are not an option, as they introduce play.
  • 3 × 2-cell 18650 battery holders wired in series (7.4 V)
  • 20 × 8×3×4 mm bearings (outer × inner × width)
  • 1 × MPU6050 IMU (accelerometer + gyroscope).

View all 11 components

  • Improving the leg mechanism and new PCB board.

    Miguel Ayuso Parrilla · 11/01/2022 at 02:17 · 0 comments

    It's been a while since I last updated the project, but that doesn't mean I've stopped working on the robot ;)

    About the code, there is still a very primitive version of the model on GitHub. I've also been improving it, but some cleaning up is still needed, as I want it to be clear and easy to understand. I hope it can be done in the next few months!

    You can check further updates on Instagram


    Designing a PCB board

    I want to share the PCB I made to interface all the connections. Making the wiring by hand is always tedious and can be considered an art in itself.

    But nowadays it is very easy to design and manufacture your own PCBs on a very small budget.

    The board is very simple and makes the wiring job very straightforward. Its characteristics are:

    • Teensy 4.1 interface. It is very fast, so it can even handle an LCD while running at a high control frequency.
    • I2C bus for a 9-DOF absolute-orientation IMU, a BNO055 in my case.
    • Serial TX/RX and Vin to power the Teensy.
    • 13 servo connections divided among 4 different power supplies.
    • 13 analog inputs, in order to read each servo's potentiometer.
    • Voltage divider to read battery voltage.
    • 3V3 output.
    • Push button.

    ATTENTION: if you power the Teensy through the 5 V pin, do not connect the USB supply at the same time, as having both voltages connected can damage the microcontroller.

    Manufacturing it is very easy now. Thanks to PCBWay we get high-quality PCBs plus very good service, resolving any manufacturing doubt very quickly. It was very easy to order the boards!

    You can get PCB files here.



    ________________________________________________________________________________


    Improving the leg mechanism.


    Over these 3 years I have tried several leg mechanisms.

    Eventually I settled on a simple design with the tibia motor placed at the femur joint.

    But this design still had several problems: it wasn't very robust, and most importantly, having the motor (a large mass) that far from the rotation axis introduced unwanted dynamics into the robot body during fast movements, making controllability worse.

    The new version has both the femur and tibia motors mounted on the coxa frame. This results in a very simple setup and, at the same time, keeps the heaviest masses of the mechanism centered on the rotation axis of the coxa, so even when the leg makes fast movements the inertias aren't strong enough to disturb the whole robot body, achieving more agility.

    Inverse Kinematics of the mechanism

    After building it I noticed that this mechanism is special for another reason: in the range where the leg normally moves, it acts as a differential mechanism, meaning the torque is shared between both motors of the longer limbs almost all the time. That is an improvement, since with the old mechanism the tibia motor had to hold most of the weight and was more stressed than the femur one.

    To visualize this, for the same movement we can see how the tibia motor of the old design must travel through a larger arc than the one in the new version.

    In order to solve this mechanism, only some trigonometry is needed. Combining the law of cosines and the law of sines, we can relate the desired angle (the one between femur and tibia) to the angle the motor must reach.

    Observing these equations, we can notice that this angle (the one between femur and tibia) depends on both servo angles, which means both motors contribute to the movement of the tibia.
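    As a rough sketch of that trigonometry (not the exact equations from the figures, and with made-up link lengths), a generic four-bar-style linkage can be solved with the law of cosines as below. In this sketch the crank angle is measured from the line between the two fixed pivots; in the real leg that reference line moves with the femur, which is consistent with the knee angle depending on both servo angles.

    import numpy as np

    def driven_angle(theta_crank, a=0.025, b=0.080, c=0.030, g=0.075):
        #a: crank (servo horn) length, b: pushrod length, c: driven lever length,
        #g: distance between the two fixed pivots; all lengths are placeholders (metres)
        #theta_crank: crank angle measured from the line joining the two pivots (radians)
        d = np.sqrt(a**2 + g**2 - 2*a*g*np.cos(theta_crank))                  #diagonal, law of cosines
        beta = np.arctan2(a*np.sin(theta_crank), g - a*np.cos(theta_crank))   #direction of that diagonal
        cos_psi = np.clip((c**2 + d**2 - b**2) / (2*c*d), -1.0, 1.0)          #law of cosines again
        psi = np.arccos(cos_psi)                                              #angle between diagonal and driven lever
        return beta + psi   #driven joint angle (open configuration)

    #sweep the servo and see how the driven joint follows
    for deg in range(30, 151, 30):
        print(deg, round(np.degrees(driven_angle(np.radians(deg))), 1))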


    Calibration of servos

    Another useful thing to do if we want to control the servos precisely is to print a calibration tool for our setup. As shown in the image below, in order to know where the real angles are located, an angle protractor is placed at the origin of the rotating joint; by choosing 2 known angles we can match the PWM signal to the real angles we want to command, simply by building a linear relation between angle and PWM pulse length.

    Then a simple program using the serial console can be written to let the user move the motor to the desired angle. This way the calibration process is only about placing the motor at a certain position...
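    As a sketch of that linear relation (the example angles and pulses below are placeholders; use the pairs measured with the printed protractor tool), the slope and offset of pulse = A·angle + B follow directly from the two measured points:

    def fit_pulse_line(angle1_deg, pulse1_us, angle2_deg, pulse2_us):
        #return (A, B) of pulse = A * angle + B from two measured points
        A = (pulse2_us - pulse1_us) / (angle2_deg - angle1_deg)
        B = pulse1_us - A * angle1_deg
        return A, B

    def angle_to_pulse(angle_deg, A, B):
        return int(A * angle_deg + B)

    #example: 0 deg -> 1000 us and 90 deg -> 2000 us (placeholder measurements)
    A, B = fit_pulse_line(0, 1000, 90, 2000)
    print(angle_to_pulse(45, A, B))   #-> 1500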


  • New leg design.

    Miguel Ayuso Parrilla · 07/29/2022 at 12:20 · 4 comments

    This log is just to let everyone know that I'm still working on the robot. It's been quite a long time without any update (studying physics has required most of my time these past years), but I've made some important progress on leg design and simulation.

    From now on I'll be updating and producing more content about the robot. Stay tuned!

    Until then, to illustrate what is coming, here is a render of the new leg.

    Tests of this design have shown a quite remarkable improvement in performance.

  • Project bibliography and interesting papers.

    Miguel Ayuso Parrilla · 10/02/2020 at 13:42 · 0 comments
  • Small community.

    Miguel Ayuso Parrilla · 09/28/2020 at 15:32 · 3 comments

    This log is just to show the reach of the project and to say thanks for the support from all of you who are reading it.

    It is awesome to see that the hard work you put in can be very useful for others, even to the point of creating a new platform with a small community that is growing every day.

    This project has let me see the value of the OPEN SOURCE community: how easy it is nowadays to develop a new platform, with contributors from very different places in the world but with one common interest, learning things in order to make our lives more interesting.

    Here are some of the makers who have supported the project since the first day and already have a small Chop at home:

    Thanks to @OilSpigot 

    (min 4:00 to see his robot in action)

    Thanks to @Creasento:



  • Bill of materials and price of the robot.

    Miguel Ayuso Parrilla · 09/19/2020 at 12:33 · 0 comments

    To complete the Components section, in this log I want to show the complete bill of materials, with the price for one unit of the robot. I have to say I'm surprised how cheap it is to build just one.

    In my case, the development has been more expensive, as I have made 3 different prototypes with their respective components, so I have spent roughly twice the price of one unit.

    Almost all components were bought from the typical Chinese online warehouses.

    Once you have all the components, the building process is not very hard if you have experience with 3D-printed prototyping.

    Go to the Instructions section

  • PyBullet Simulation example

    Miguel Ayuso Parrilla · 09/15/2020 at 13:06 · 0 comments

    In this log I'm going to talk about the simulation used during the robot's development; it was mostly used for debugging the maths inside the different models.

    The simulation code is no longer part of the Raspberry Pi code, as the Pi can't support GUI mode and I'm not taking data from the PyBullet environment. The code I am referring to is in the V1_PCsimulation branch of the robot's GitHub repository.


    This code can be run with the robot by just plugging the Arduino into the PC via USB and running the robot_walking.py file; all requirements are in the README.md.

    But if you don't have the robot built, you can run just the simulation and play with the robot inside the PyBullet environment. Also feel free to modify and improve it.

    You just need PyBullet and NumPy on Python 3; then download the V1_PCsimulation branch and run walking_simulation_example.py.

    More info on how the simulation works is shown in the next video:


    Now I'm going to briefly explain the main code of the simulation, shown here:

    """
    Created on Sun Mar 15 19:51:40 2020
    @author: linux-asd
    """
    
    import pybullet as p
    import numpy as np
    import time
    import pybullet_data
    from pybullet_debuger import pybulletDebug  
    from kinematic_model import robotKinematics
    from gaitPlanner import trotGait
    
    """Connect to rhe physics client, you can choose GUI or DIRECT mode"""
    physicsClient = p.connect(p.GUI)#or p.DIRECT for non-graphical version
    p.setAdditionalSearchPath(pybullet_data.getDataPath()) #optionally
    """set Gravity"""
    p.setGravity(0,0,-9.8)
    
    """import the robot's URDF file, if it is fixed no ground plane is needed""" 
    cubeStartPos = [0,0,0.2]
    FixedBase = False #if fixed no plane is imported
    if (FixedBase == False):
        p.loadURDF("plane.urdf")
    boxId = p.loadURDF("4leggedRobot.urdf",cubeStartPos, useFixedBase=FixedBase)
    
    jointIds = []
    paramIds = [] 
    time.sleep(0.5)
    """get info along all the joints for debugging if needed"""
    for j in range(p.getNumJoints(boxId)):
    #    p.changeDynamics(boxId, j, linearDamping=0, angularDamping=0)
        info = p.getJointInfo(boxId, j)
        print(info)
        jointName = info[1]
        jointType = info[2]
        jointIds.append(j)
        
    footFR_index = 3
    footFL_index = 7
    footBR_index = 11
    footBL_index = 15   
    
    """init the classes used"""
    pybulletDebug = pybulletDebug()
    robotKinematics = robotKinematics()
    trot = trotGait() 
    
    #robot properties
    """initial foot position"""
    #foot separation (Ydist = 0.16 -> tetta=0) and distance to floor
    Xdist = 0.20
    Ydist = 0.15
    height = 0.15
    """This matrix will define where the feet are going to be at its standing position
    with respect the center of the robot"""
    #body frame to foot frame vector
    bodytoFeet0 = np.matrix([[ Xdist/2 , -Ydist/2 , -height],
                             [ Xdist/2 ,  Ydist/2 , -height],
                             [-Xdist/2 , -Ydist/2 , -height],
                             [-Xdist/2 ,  Ydist/2 , -height]])
    """Gait definition parameters, here you can set different period of step and
    the offset between every foot's loop, defining different gaits"""
    T = 0.5 #period of time (in seconds) of every step
    offset = np.array([0.5 , 0. , 0. , 0.5]) #defines the offset between each foot step in this order (FR,FL,BR,BL)
    
    """start a real time simulation, so no simulation steps are needed"""
    p.setRealTimeSimulation(1)
        
    while(True):
        lastTime = time.time()
    """show sliders in the GUI"""
        pos , orn , L , angle , Lrot , T = pybulletDebug.cam_and_robotstates(boxId)  
        #calculates the feet coord for gait, defining length of the step and direction (0º -> forward; 180º -> backward)
        bodytoFeet = trot.loop(L , angle , Lrot , T , offset , bodytoFeet0)
    
    #####################################################################################
    #####   kinematics Model: Input body orientation, deviation and foot position    ####
    #####   and get the angles, neccesary to reach that position, for every joint    ####
        FR_angles, FL_angles, BR_angles, BL_angles , transformedBodytoFeet = robotKinematics.solve(orn , pos , bodytoFeet)
         
    """move all joints at the next time step angle"""
        #move movable joints
        for i in range(0, footFR_index):
            p.setJointMotorControl2(boxId,...

  • Future 3DoF foot sensor.

    Miguel Ayuso Parrilla · 08/27/2020 at 20:52 · 0 comments

    In order to make the robot dynamic in the precise meaning of the word, we need measurements of the forces acting on the robot; these can come from the actuators, the feet or the IMU.

    This way we can build a robust control algorithm that takes these forces into account, such as an inverted-pendulum model. Basically, what we want here is to implement a physics-based model instead of a purely geometric one like the kinematic model it implements now.

    This task is not easy to achieve, but I found a paper with a very intelligent solution to this problem: Foot design for a hexapod walking robot.

    Basically, it uses 3 force-sensitive resistors, oriented appropriately so as to read the forces at the foot along 3 different directions.
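    As a rough sketch of the idea (the sensing directions below are arbitrary placeholders, not the geometry from the paper): if each FSR measures the projection of the contact force onto its own axis, the 3D force can be recovered by solving a small linear system.

    import numpy as np

    #each row is the (assumed) unit sensing direction of one FSR inside the foot
    directions = np.array([[1.0  , 0.0  , 0.0  ],
                           [0.0  , 1.0  , 0.0  ],
                           [0.577, 0.577, 0.577]])

    def foot_force(readings):
        #readings: the 3 calibrated FSR values (force along each sensing axis)
        #readings = directions @ F  ->  solve the linear system for the force vector F
        return np.linalg.solve(directions, np.asarray(readings, dtype=float))

    print(foot_force([2.0, 0.0, 5.0]))   #example readings -> estimated [Fx, Fy, Fz]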

    What I came up with is shown in these pictures, but it is only a quick design with a lot of errors when it comes to reading accurate measurements.


    Some recent work on the MIT Cheetah shows how useful this type of foot sensor can be:

  • Calibrating the servos.

    Miguel Ayuso Parrilla · 08/01/2020 at 09:15 · 0 comments

    First of all, you will need to plug every servo into its proper pin; in my case they are as follows (you can change this for your own setup):

    void connectServos() {
                          //FR
      Servos[0].attach(40); //coxa
      Servos[1].attach(38); //femur
      Servos[2].attach(36); //tibia
                          //FL
      Servos[3].attach(42); //coxa
      Servos[4].attach(44); //femur
      Servos[5].attach(46); //tibia
                         //BR
      Servos[6].attach(34); //coxa
      Servos[7].attach(32); //femur
      Servos[8].attach(30); //tibia
                         //BL
      Servos[9].attach(48); //coxa
      Servos[10].attach(50); //femur
      Servos[11].attach(52); //tibia
    }

    Once the servos are correctly attached, what I did in order to put them in their correct position was simply to disassemble the legs, i.e. separate each servo output shaft from its horn, so that the servo can move freely. This way you can power the servos, let them go to the zero position and then reassemble.

    I wrote a small script to put the servos in their zero position; you can find it in the servo_calibration.py file. This way you can attach each servo horn very close to its zero position.

    Then a more precise calibration is needed: you will need to calibrate the offset of each servo. Those offsets are defined in the angleToPulse.py file, which only contains the line defining the pulse corresponding to each angle for every servo.

    The line has the form y = Ax + B, where the offset of each servo is B, i.e. the pulse at zero angle.

    If you are using different servos, the slope of the line (A) will be different, so you will need to identify it. What I did was simply take 4-5 measurements of the pulse at different angles and put them in a spreadsheet, where I calculated the fitted line from those points.

    import numpy as np

    def convert(FR_angles, FL_angles, BR_angles, BL_angles):
        pulse = np.empty([12])
        #FR
        pulse[0] = int(-10.822 * np.rad2deg(-FR_angles[0])) + 950
        pulse[1] = int(-10.822 * np.rad2deg(FR_angles[1])) + 2280
        pulse[2] = int(10.822 * (np.rad2deg(FR_angles[2]) + 90)) + 1000
        #FL
        pulse[3] = int(10.822 * np.rad2deg(FL_angles[0])) + 1020 
        pulse[4] = int(10.822 * np.rad2deg(FL_angles[1])) + 570
        pulse[5] = int(-10.822 * (np.rad2deg(FL_angles[2]) + 90)) + 1150
        #BR
        pulse[6] = int(10.822 * np.rad2deg(-BR_angles[0])) + 1060 
        pulse[7] = int(-10.822 * np.rad2deg(BR_angles[1])) + 2335 
        pulse[8] = int(10.822 * (np.rad2deg(BR_angles[2]) + 90)) + 1200
        #BL
        pulse[9] = int(-10.822 * np.rad2deg(BL_angles[0])) + 890
        pulse[10] = int(10.822 * np.rad2deg(BL_angles[1])) + 710
        pulse[11] = int(-10.822 * (np.rad2deg(BL_angles[2]) + 90)) + 1050
        return pulse
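    As a side note, the same line fit can be done directly in Python instead of a spreadsheet; here is a minimal sketch (the measurement pairs below are placeholders, use your own):

    import numpy as np

    #least-squares fit of pulse = A * angle + B from a handful of measurements
    angles_deg = np.array([0, 30, 60, 90, 120])
    pulses_us  = np.array([950, 1290, 1610, 1930, 2260])

    A, B = np.polyfit(angles_deg, pulses_us, 1)   #slope and offset of the line
    print(A, B)   #A should come out close to the ~10.8 us/deg used above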

  • A more stable walking.

    Miguel Ayuso Parrilla · 07/20/2020 at 11:49 · 0 comments

    This log is about polishing the gait planner, as you can see in the next video:


    What is interesting in this video is that no control algorithm is running, yet the robot seems very stable.

    This is achieved just by separating the stance phase from the swing phase (as explained in the gaitPlanner.py log). This way I can define an offset between both phases: running the stance phase over 3/4 of the whole step makes the robot hold its weight on all 4 legs for a short period of time, making it a bit more stable just by running the gaitPlanner loop.
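    As a small sketch of that scheduling (using the 3/4 stance fraction described here and the 0.5/0/0/0.5 phase offsets used for the trot), each foot just checks where it is inside its own shifted cycle; the real trajectories are generated in gaitPlanner.py:

    import numpy as np

    beta = 0.75                                #fraction of the step spent in stance
    offsets = np.array([0.5 , 0. , 0. , 0.5])  #phase offset per foot (FR, FL, BR, BL)
    T = 0.5                                    #step period in seconds

    def leg_phases(t):
        #return, for each leg, whether it is in stance or swing and its normalized phase
        phases = (t / T + offsets) % 1.0
        return [("stance" if ph < beta else "swing", round(ph, 2)) for ph in phases]

    for t in np.arange(0.0, 0.5, 0.125):
        print(round(t, 3), leg_phases(t))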
    Visually, the gait is defined as follows:

    As additional info, the servos get very hot if the robot runs for more than 10 minutes.

  • Quadruped robot meets my cat.

    Miguel Ayuso Parrilla · 07/07/2020 at 18:29 · 0 comments

    As a quick update on the V2 robot's performance, I have introduced it to my cat.

    My cat is very introverted and usually doesn't like new things; this case is no different, but the robot does seem to spark some curiosity in her.

    The robot performs well with the latest version of the code. The extra weight of the batteries and the Raspberry Pi is not a problem, as the new design is much lighter and the servos are now powered at 6 V instead of 5 V as in the first version.

    On the other hand, I have noticed the model randomly gives bad solutions when running on the Raspberry Pi, which is why you sometimes see a leg make a strange movement. I haven't identified the problem yet; maybe it is a problem in my software or a buffer overflow.

    There is no intention to hurt the animal, it's my cat, omg.

View all 17 project logs

  • 1
    Building premises.

    I'm going to enumerate some important concepts I have taken into account in the robot design.

    • This robot depends heavily on its mass distribution and on the dynamics of the moving parts (legs). That's why I decided to go for this servo configuration, with the coxa servo on the body, the femur servo on the coxa and the tibia servo on the femur, so only 2 servos move with each leg.
    • This way, the lighter the leg, the less unexpected inertia, giving better dynamic behavior.
    • Also, the robot's weight should be as low as possible, because hobby servos are easily overloaded.
    • About the mass distribution: it defines the Center of Mass, which should be as close as possible to the geometric center. For this, the build must be as symmetric as possible; that way the CoM is easier to locate and the control is more predictable.
    • The length of the leg is set to 100 mm for both tibia and femur; this way, a motor with 20 kg·cm of torque can hold 2 kg at a 10 cm arm. The longer the arm, the less force the motor can support.
    • Avoid plastic-on-plastic joints, as they introduce unnecessary friction.

    Printing premises.

    • This is simple but important to take into account when printing: these parts do not bear large mechanical forces, so, following the building premises, the 3D-printed parts should be printed with ~10-20% infill, 1-2 outer perimeters and 2-3 solid layers.
  • 2
    Building and printing the robot.

    For building it I have made this tutorial, as there are some tricky parts.

    In the file section, each STL is named with the quantity of parts you need to print; if no quantity is given, just one needs to be printed.

    You can also download all the files at once from the file section, in STL_files.zip.

    If you are building the robot, take this into account:

    Some servos have a small ridge around the mounting holes to make that part stronger; for my design I cut these small ridges off in order to make the parts easier to print and design.


    Bolt list:

    • M3 x45 mm: quantity 12
    • M3 x30 mm: quantity 26
    • M3 x25 mm: quantity 16
    • M3 x20 mm: quantity 8
    • M3 x15 mm: quantity 16
    • M3 x7 mm: quantity 4
    • M3 threaded rod x 42 mm: quantity 4
    • M3 threaded rod x 74 mm: quantity 8

    Plus their respective nuts and washers where needed.

  • 3
    Electronics.

    Here is the electronics setup I have:

    You can download the full-size PDF of the scheme from the file section.

    As you can see, apart from the 4 power lines that power the servomotors, there is also a voltage divider so the Arduino can read the battery voltage on an analog pin, and 5 LEDs then indicate the battery status. The IMU is connected via the I2C interface. The Raspberry Pi is powered via another buck converter at 5 V 3 A, which is missing from the scheme (I will correct this in the next updates), and the Arduino communicates with the Raspberry Pi via a USB cable (which also powers it).
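    As a small sketch of the voltage-divider arithmetic (the resistor values and the 10-bit, 5 V ADC below are assumptions for illustration, not the exact values in my scheme):

    R1 = 10000.0    #resistor from battery + to the analog pin (ohms), placeholder value
    R2 = 4700.0     #resistor from the analog pin to ground (ohms), placeholder value
    ADC_REF = 5.0   #ADC reference voltage
    ADC_MAX = 1023  #10-bit ADC

    def battery_voltage(adc_counts):
        v_pin = adc_counts / ADC_MAX * ADC_REF   #voltage seen at the analog pin
        return v_pin * (R1 + R2) / R2            #undo the divider ratio

    print(battery_voltage(530))   #example reading -> about 8.1 V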

    I made two small modular boards to easily connect and disconnect the servos. These boards are connected to the big one, which supplies all the pins to the Arduino: the 12 PWM signals with their common ground and the I2C interface for the IMU. I can also add sensors (FSRs) here the same way in the future.

    Details of the build:

    • Starting with the small boards that supply the servo power and the digital signals from the Arduino: they have shared GNDs and are divided into 4 different power lines, one for each buck converter. They are connected to the main board, which accommodates the IMU and two of these boards.
    • The other part that hosts some electronics is the back cover of the robot, where the emergency button is placed along with the 5-LED battery indicator; the voltage divider used to read the battery voltage is also placed here, attached with some hot glue.
    • Finally, everything is ready to be mounted; notice that the buck converters are placed between the femur servos and attached with tape. You can also see how the Raspberry Pi and Arduino Mega are attached to the top of the robot.
    • Lastly, the battery is made of individual cells, so I have to charge them one by one.

View all 3 instructions


Discussions

hoannaoh wrote 03/02/2024 at 07:24 point

Hi Miguel

How did you calculate and choose the servos? I saw that you shared a reference about that in this project, but I didn't see how you calculated it.


SlinkyDogLag wrote 07/29/2023 at 20:11 point

Is it possible to use an Arduino Uno for this project?


Raa wrote 06/29/2023 at 12:24 point

Hello sir, can you help me? I made this project but I'm using only an Arduino for my program, and I don't know how to control the servos with inverse kinematics; I want to test it from my serial monitor. Please help me, do you have example code for a walking robot dog? I'm trying to make this to finish my exam.


jeff wrote 03/05/2022 at 17:18 point

Hi Miguel, this is an excellent project. I set up a Python 3 environment to try to run walking_simulation_example.py. Everything appears to load correctly. However, the robot in the simulation flips over on its back and does nothing else. I assume I have set something up incorrectly. Do you have any advice on what the problem could be?


Miguel Ayuso Parrilla wrote 07/29/2022 at 12:01 point

Hi Jeff, the simulation code may not have been well maintained and some bugs appeared during the last updates. I'm sure this comes from the URDF/simulation configuration. Code will be released soon with more stable behavior.


zonhou wrote 12/17/2021 at 08:06 point

Hi, I would like to know if I can swap the Raspberry Pi 4B for a Raspberry Pi Pico?


Marc wrote 11/26/2021 at 22:27 point

Hi Miguel, I built the project with a few changes.  Here's a video link to my 'chop' in action.

https://youtu.be/dPs3IqKK56k

More detailed pictures:

https://share.icloud.com/photos/0-dKsGwwySdxO7aC8YIgP7zIg#Lancaster

https://share.icloud.com/photos/0_Ec41BSFRRfTSoeZqfCkJpCA#Lancaster
Notes:

1. The Raspberry Pi I was using died so I used one of the Jetson Nanos I had lying around.  The software didn’t need any changes for the Nano and I prefer developing on the Nano vs. the Pi.

2. I centered all the servos, then put everything together so the joint angles were at the middle of the range.  At this point I wrote a script to move each servo (slowly) with a joystick and recorded the extents of the safe positions.  I limit the servo commands to within the safe positions before sending the command to the Arduino.  The 20 kg servos can easily break 3D printed parts.

3. I used a little PioLed display (works on the Pi too) for showing information like battery level, modes and errors.

4. Power is important.  Use the proper wire gauge and good quality BEC units.  Some of the stuff on Amazon is crap.

5. Build a stand.  See second picture.  It makes life easier.

6. As Miguel said, weight is a concern.  Using the heavier Nano didn’t help.

Thank you for creating this awesome project.  You are brilliant!


Miguel Ayuso Parrilla wrote 07/29/2022 at 11:58 point

Hi Marc,

I appreciate your comment. Helpful tips!


Ahmed Gaber wrote 11/18/2021 at 09:07 point

Hi, I am building the project now based on your data.

Please tell me if there is any updated data not yet uploaded, in the code or in the hardware?


Finn wrote 07/21/2021 at 10:53 point

Hi!

Great Project:-)

As I am new and a total noob at software, is there a guide on how to install the code on the RPi and Arduino?

Thanks for sharing


Average Joe wrote 06/05/2021 at 21:46 point

Why did you choose an Arduino Mega over an Arduino Uno?


davek00997744 wrote 05/11/2021 at 09:16 point

Hi, is there any update on this? I have printed a few leg parts, but they do not match the video and there is not enough detail to help me assemble them. I will add that I am a retired beginner :)  Or is this not a project suitable for a noob??


Miguel Ayuso Parrilla wrote 05/11/2021 at 23:31 point

There are some updates on the code but not on the hardware atm. I just sent you a private message about the build. Let's talk there.


Zinc-OS wrote 03/02/2021 at 21:04 point

Seen many quadruped-type bots.


Zinc-OS wrote 03/02/2021 at 21:05 point

Would like something new


deʃhipu wrote 05/11/2021 at 20:59 point

Like a quintuped robot?


hj.jawedkhan wrote 02/23/2021 at 05:55 point

Hey, which software do you use for coding?


CyberStefNef wrote 01/17/2021 at 18:11 point

Hey, I am so close to getting it running. I just have the problem that whenever I run the program the legs on the right side are turning completely up and down again. On the way down the back one is blocking the front one. Is this normal? I found out, that the legs are turning up after the command: self.arduino.setDTR(True) in the serial.py. Any suggestions?


tfx2 wrote 10/23/2020 at 20:15 point

What are the joint limits, upper/lower for each of the joints? It's not defined in the URDF.

It looks like there are two model numbers of DS3218 PRO, one for 180° rotation and the other for 270°.


Lance Cassidy wrote 10/22/2020 at 21:05 point

Thank you for sharing this Miguel!

For those that might be new to electrical engineering, I recreated the schematic diagram here to show each of the individual pin connections, made a few small corrections, and added links to source the parts mentioned in the BOM. You can find the part source links in part properties when you click them.

Here’s the updated schematic diagram: https://bit.ly/35oQA9a

Feel free to fork this document to make edits and improvements! I’m looking forward to the MIT Cheetah implementation :)

As for some of the corrections, I realized that the serial connection you show between the Raspberry Pi and the Arduino mega should actually be a USB serial bus. The pins should probably be data D+ and data D- instead of TX and RX.

It looks like you’re already aware, but as I was redrawing the schematic I realized that you are missing a bulk converter for the Raspberry pi that powers the Arduino and servos. I added that to the updated schematic. 

Hope this helps!


rodrigomurilloaranda wrote 10/16/2020 at 21:46 point

You did a wonderful job. I saw that you gave "Dynamic Locomotion in the MIT Cheetah 3 Through Convex Model-Predictive Control" as a reading recommendation, but I did not see it implemented in the code. I was wondering if you implemented it in your code.


Miguel Ayuso Parrilla wrote 10/18/2020 at 22:46 point

Thanks mate! That paper is referenced just to show how the real MIT Cheetah robot works; it is also very interesting to read, and there are some interesting concepts about control systems that I have used, but I have not directly implemented any model from it.

We can't implement this type of dynamic control model with servo robots, as there isn't any torque or contact-force feedback. Even if there were, this MPC would require very precise force readings in order to keep the model stable. One more thing: servos run at a lower frequency than the MPC should, so it doesn't make any sense to implement such a powerful MPC inside a slow and imprecise system like my robot.


rodrigomurilloaranda wrote 10/28/2020 at 20:32 point

Thank you, I'm new to this topic. My objective is actually to implement a model of the Cheetah using MPC, but your work helped me to understand the theory applied to a real model that actually works. Once again, thanks for your contribution.

P.S. (Feel free to write to me in Spanish, it's my mother tongue, if you prefer.)


stevensturgis wrote 10/06/2020 at 16:09 point

What would be the ramifications of building this at 200% scale, both for the servo load needed and mechanically? Would the programming still work? Would 12 V 60 kg servos work? I would like to build it, but it's very small. I would like to use it for stairs in the future.


Gaultier Lecaillon wrote 09/21/2020 at 11:50 point

Hi !

I see that you use an I2C interface to communicate between the Pi and the Arduino to control the servos.
What about a PCA9685 servo driver, which would allow you to control 16 servos and do without the Arduino board and the IMU I2C interface?

see details here: https://cdn-learn.adafruit.com/downloads/pdf/adafruit-16-channel-servo-driver-with-raspberry-pi.pdf

As you would no longer need an Arduino board, you could control the actuators directly from the application layer in the Python script.

Thanks for your work, very impressive !


Miguel Ayuso Parrilla wrote 09/28/2020 at 14:46 point

Hi mate! Sorry for the late response, I've been very busy.

For the RPi-Arduino communication I don't use I2C, I'm using serial commands; they are very simple to configure, but it isn't the best way.

I know about the PCA9685 board, but I don't have really good experience with it, as the servos are powered through the board, and for this task you would burn it, since it can't handle the 36 A peak of these 12 servos.

Apart from controlling the servos, the microcontroller (Arduino) is also used to read and filter the IMU data, read and display the battery status, and it will also be used to read other sensors I will add in the future, so for me a microcontroller is necessary.

Maybe nowadays the best option we have is a Teensy board.

Tons of thanks for the support!


Charan wrote 08/04/2020 at 10:48 point

Hi sir! I'm completely new to hardware and I'd like to learn how to build cool robots like these. 

Can you please cite some good resources (books/websites/videos/courses/hands-on learning projects or whatever you can think of) for kickstarting hardware hacking ?

Btw, here's a brief of my background

I'm a mech engg. undergrad and I  am familiar with a bit of Python and MATLAB. I can do CAD stuff well.

(I'm not sure if it's appropriate to ask this here, since it's unrelated to the project...I just thought maybe if there's someone else like me, they'll find this comment thread useful...)


Miguel Ayuso Parrilla wrote 08/06/2020 at 17:26 point

Hi! Nice question, as there are a lot of people asking me that, and my advice is as simple as: read. For that it is important to know the level you are at, so you can understand what you read.

It is true that if you are a beginner there are lots of videos on YouTube that are very useful, but they are just introductions.

Now the question is, what to read? Personally I read lots of other people's projects; these are personal projects, but now I'm also reading professional projects and scientific research, where the information is truly tested and you can take the right solutions for what you are looking for.

Also, we can't forget to read code, because to learn coding we need to practice, but at the first stage we need to learn structure by reading.

About the resources, for now I can recommend Ignite robots academy for learning ROS, because it is what I'm using now. If you are interested in the resources I used for the robot, they are named throughout the explanatory logs.

I'm sorry not to give you specific resources, but there are tons of them on the internet, and as you read projects you will find the resources they used.


Charan wrote 08/07/2020 at 03:41 point

Thanks a lot for your suggestion!

