
DIY hobby servos quadruped robot

Cheap 3D-printed four-legged robot that almost looks like Boston Dynamics' Spot but moves like a newborn.

As a physics undergraduate, bored of solving problems on paper, I decided to tackle a real-world one, so I started this robotics project to introduce myself to control theory, study its maths and practice programming.
It runs with a Raspberry Pi 4 as the brain, plus an Arduino Mega that reads and writes the signals of the 12 hobby servos and the IMU, all controlled with a PS3 DualShock.

General Summary

The code, mostly written by me in Python 3 along with some typical libraries for the task, currently implements a walking loop on top of a kinematic model: you can command a pose for the body, as well as foot positions, plus velocity and direction commands for the walking loop. Everything is controlled with a PS3 controller over Bluetooth.

The Arduino is used to speed up the reading and writing of signals, so that in the future I can add other sensors, as well as position feedback.

Let's see what this under-$500 robot can do:

Project log index (updated):

  1. Model and code explanation.
    1. The kinematic model.
    2. Step trajectory and Gait planner.
    3. Arduino code and serial communication.
    4. Raspberry Pi communication interfaces.
    5. Calibrating the servos.
    6. PyBullet Simulation example.
    7. Future 3DoF foot sensor.
  2. Robot experiments.
    1. A simple walking method.
    2. Real robot vs PyBullet simulation.
    3. Hitting the robot.
    4. Quadruped robot meets my cat.
    5. A more stable walking.

💻💻💻💻 Code of the robot.


📚📚📚📚 Bibliography and readings of the project.


🔨🔨🔨🔨 Bill of materials and price.


🏆🏆🏆🏆 Makers who built the robot.


Why build a quadruped robot?

Apart from the interesting control problems this robot involves, there are lots of interesting tasks it can carry out, and they are beginning to be demonstrated, as we saw with the Spot robot during the 2020 pandemic:

It's obvious that there is still a lot of work to do, but we have reached the point where we can build one of these at home and get a very good approximation of the real thing.

There is a key factor in this robot: it doesn't need roads in order to operate, so it can reach places that are very hard for a wheeled machine. That makes these four-legged robots ideal for tasks such as surveillance, reconnaissance of areas not passable by vehicles, and even rescue in areas at risk of collapse.

Finally, it is clear that for now my robot isn't able to do those tasks, but I am satisfied that it walks stably, and who knows, maybe in the future it will be able to bring me a beer from the fridge just by being told: 'dog, bring me a beer!'

What problems will you face with this robot?

These legged robots have always amazed me, but it was hard to find easy-to-understand research, and the designs combine non-trivial mechanics with complex maths. That makes it hard to achieve good results with DIY resources, so I rolled up my sleeves and started reading papers from different research groups working on the subject.

What does my project bring to the maker community?

As this project is multidisciplinary, apart from designing my own version of the robot, I focused on its maths, solving different problems related to the movement of the robot and building a very straightforward model that implements only the maths necessary to run the robot, showing how every equation relates to the real robot and giving a general view of it.

This is not the most robust way of building a model: real-world robots apply a lot of additional mathematical methods to polish their behaviour. For example, adding interpolation would make the robot move more smoothly, and building a state estimator plus a filter (e.g. a Kalman estimator) would give you very precise measurements on the robot and make its movements even more natural.
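To make the interpolation idea concrete, here is a minimal sketch (my own illustration, not code from the robot): instead of jumping straight to a new pose, the command is split into small intermediate waypoints.

```python
import numpy as np

def interpolate_pose(start, goal, steps):
    """Linearly interpolate between two joint-angle vectors.

    Sending many small intermediate commands instead of one big jump is a
    simple way to smooth motion; real robots often use splines or filters.
    """
    start = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    return [start + (goal - start) * t for t in np.linspace(0.0, 1.0, steps)]

# Split a (hypothetical) coxa/femur/tibia command into 5 waypoints:
waypoints = interpolate_pose([0.0, 0.0, 0.0], [0.3, -0.6, 0.9], steps=5)
```

Feeding these waypoints to the servo loop one per tick produces the same final pose with a much gentler trajectory.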


Visual scheme of electronics and model.

As you can see in the scheme, in this setup I use one buck converter per 3 servos, 4 buck converters in total. This is because each servo can draw up to 4 A peak, so every converter must be able to supply around 12 A peak for its three servos.

NOTE: this is not meant as a rigorous wiring diagram, it is just a drawing.

Licensing

This project uses the GPL v3 license for all software related.

This project uses the Attribution-NonCommercial-ShareAlike 4.0 International for all hardware components.


Robot Update (AUG 2020)...



  • 12 × Servo DS3218, PRO version. 20 kg/cm torque and 0.1 s/60º.
  • 12 × Metal horns for the servos. Plastic ones are not an option, as they introduce play.
  • 3 × 2x 18650 battery holders in series (7.4 V)
  • 20 × 8x3x4 mm bearings (outer × inner × width)
  • 1 × MPU6050 IMU (accelerometer + gyroscope).


  • Project bibliography and interesting papers.

    Miguel Ayuso Parrilla 10/02/2020 at 13:42 0 comments
  • Small community.

    Miguel Ayuso Parrilla 09/28/2020 at 15:32 3 comments

    This log is just to show the impact of the project and to say thanks for the support from all of you who are reading about it.

    It is awesome to see that the hard work you put in can be very useful to others, even to the point of creating a new platform with a small community that grows every day.

    This project has shown me the values of the OPEN SOURCE community: how easy it is nowadays to develop a new platform, with contributors from very different places in the world sharing one common interest, learning things to make our lives more interesting.

    Here are some of the makers who have supported the project since the first day and already have a small Chop at home:

    Thanks to @OilSpigot 

    (min 4:00 to see his robot in action)

    Thanks to @Creasento:



  • Bill of materials and price of the robot.

    Miguel Ayuso Parrilla 09/19/2020 at 12:33 0 comments

    To complete the Components section, in this log I want to show the complete bill of materials, with the price for one unit of the robot. I have to say I'm surprised how cheap it is to build just one.

    In my case, development has been more expensive, as I have made 3 different prototypes with their respective components, so in total I have spent roughly twice the price of one unit.

    Almost all components were bought from the typical Chinese online webshops.

    Once you have all the components, the building process is not very hard if you have experience with 3D-printed prototyping.

    Go to Instruction Section

  • PyBullet Simulation example

    Miguel Ayuso Parrilla 09/15/2020 at 13:06 0 comments

    In this log I'm going to talk about the simulation used during the robot's development; it was mostly used for debugging the maths inside the different models.

    The simulation code is no longer part of the Raspberry Pi code, as it can't support GUI mode and I'm not taking data from the PyBullet environment. The code I am referring to is in the V1_PCsimulation branch of the robot's GitHub repository.


    This code can be run with the robot by just plugging the Arduino into the PC via USB and running the robot_walking.py file; all requirements are listed in README.md.

    But if you don't have the robot built, you can run just the simulation and play along with the robot inside the PyBullet environment. Also feel free to modify and improve it.

    You just need PyBullet and NumPy on Python 3, then download the V1_PCsimulation branch and run walking_simulation_example.py.

    More info on how the simulation works is shown in the next video:


    Now I'm going to briefly explain the main code of the simulation, shown here:

    """
    Created on Sun Mar 15 19:51:40 2020
    @author: linux-asd
    """
    
    import pybullet as p
    import numpy as np
    import time
    import pybullet_data
    from pybullet_debuger import pybulletDebug  
    from kinematic_model import robotKinematics
    from gaitPlanner import trotGait
    
    """Connect to the physics client, you can choose GUI or DIRECT mode"""
    physicsClient = p.connect(p.GUI)#or p.DIRECT for non-graphical version
    p.setAdditionalSearchPath(pybullet_data.getDataPath()) #optionally
    """set Gravity"""
    p.setGravity(0,0,-9.8)
    
    """import the robot's URDF file, if it is fixed no ground plane is needed""" 
    cubeStartPos = [0,0,0.2]
    FixedBase = False #if fixed no plane is imported
    if (FixedBase == False):
        p.loadURDF("plane.urdf")
    boxId = p.loadURDF("4leggedRobot.urdf",cubeStartPos, useFixedBase=FixedBase)
    
    jointIds = []
    paramIds = [] 
    time.sleep(0.5)
    """get info along all the joints for debugging if needed"""
    for j in range(p.getNumJoints(boxId)):
    #    p.changeDynamics(boxId, j, linearDamping=0, angularDamping=0)
        info = p.getJointInfo(boxId, j)
        print(info)
        jointName = info[1]
        jointType = info[2]
        jointIds.append(j)
        
    footFR_index = 3
    footFL_index = 7
    footBR_index = 11
    footBL_index = 15   
    
    """init the classes used"""
    pybulletDebug = pybulletDebug()
    robotKinematics = robotKinematics()
    trot = trotGait() 
    
    #robot properties
    """initial foot position"""
    #foot separation (Ydist = 0.16 -> tetta=0) and distance to floor
    Xdist = 0.20
    Ydist = 0.15
    height = 0.15
    """This matrix will define where the feet are going to be at its standing position
    with respect the center of the robot"""
    #body frame to foot frame vector
    bodytoFeet0 = np.matrix([[ Xdist/2 , -Ydist/2 , -height],
                             [ Xdist/2 ,  Ydist/2 , -height],
                             [-Xdist/2 , -Ydist/2 , -height],
                             [-Xdist/2 ,  Ydist/2 , -height]])
    """Gait definition parameters, here you can set different period of step and
    the offset between every foot's loop, defining different gaits"""
    T = 0.5 #period of time (in seconds) of every step
    offset = np.array([0.5 , 0. , 0. , 0.5]) #defines the offset between each foot step in this order (FR,FL,BR,BL)
    
    """start a real time simulation, so no simulation steps are needed"""
    p.setRealTimeSimulation(1)
        
    while(True):
        lastTime = time.time()
        """show sliders in the GUI"""
        pos , orn , L , angle , Lrot , T = pybulletDebug.cam_and_robotstates(boxId)
        #calculates the feet coord for gait, defining length of the step and direction (0º -> forward; 180º -> backward)
        bodytoFeet = trot.loop(L , angle , Lrot , T , offset , bodytoFeet0)

    #####################################################################################
    #####   kinematics Model: Input body orientation, deviation and foot position    ####
    #####   and get the angles, necessary to reach that position, for every joint    ####
        FR_angles, FL_angles, BR_angles, BL_angles , transformedBodytoFeet = robotKinematics.solve(orn , pos , bodytoFeet)

        """move all joints at the next time step angle"""
        #move movable joints
        for i in range(0, footFR_index):
            p.setJointMotorControl2(boxId,...

  • Future 3DoF foot sensor.

    Miguel Ayuso Parrilla 08/27/2020 at 20:52 0 comments

    In order to make the robot dynamic in the precise meaning of the word, we need measurements of the forces acting inside the robot; these can come from the actuators, the feet or the IMU.

    This way we can build a robust control algorithm that takes these forces into account, such as the inverted pendulum model. Basically, what we want here is to implement a physics-based model instead of a purely geometric one like the kinematic model it runs now.

    This task is not easy to achieve, but I found a paper with a very intelligent solution to this problem: Foot design for a hexapod walking robot.

    Basically, it uses 3 resistive force sensors, oriented so that the forces at the foot can be read in 3 different directions.
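To see why three properly oriented sensors are enough: each sensor reads the projection of the contact force onto its own axis, and if the three axes are linearly independent, the readings can be inverted back into the full 3D force. A small sketch (the sensor orientations below are made up for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical measurement axes of the 3 force sensors (one per row).
# The real orientations depend on the foot geometry described in the paper.
S = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.5, 0.5, np.sqrt(0.5)]])

def foot_force(readings):
    """Recover the 3D contact force F from the 3 scalar readings r = S @ F."""
    return np.linalg.solve(S, np.asarray(readings, dtype=float))

F_true = np.array([1.0, 2.0, 5.0])   # some ground-truth contact force
readings = S @ F_true                # what the sensors would report
F_est = foot_force(readings)         # recovers F_true (up to float error)
```

In practice the hard part is calibrating the real sensor axes and dealing with the noisy, nonlinear response of the resistive sensors.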

    What I came up with is shown in these pictures, but it is only a quick design with a lot of errors when it comes to taking accurate measurements.


    Some recent work on the MIT Cheetah shows how useful this type of foot sensor can be:

  • Calibrating the servos.

    Miguel Ayuso Parrilla 08/01/2020 at 09:15 0 comments

    First of all, you will need to plug each servo into its proper pin. In my case the pins are as follows (you can change this for your own setup):

    void connectServos() {
                          //FR
      Servos[0].attach(40); //coxa
      Servos[1].attach(38); //femur
      Servos[2].attach(36); //tibia
                          //FL
      Servos[3].attach(42); //coxa
      Servos[4].attach(44); //femur
      Servos[5].attach(46); //tibia
                         //BR
      Servos[6].attach(34); //coxa
      Servos[7].attach(32); //femur
      Servos[8].attach(30); //tibia
                         //BL
      Servos[9].attach(48); //coxa
      Servos[10].attach(50); //femur
      Servos[11].attach(52); //tibia
    }

    Once the servos are correctly attached, what I did to put them in their correct position was to disassemble the legs, that is, separate each servo output from its horn so the servo can move freely. This way you can power the servos, let them go to the zero position and then reassemble.

    I wrote a small sketch to drive the servos to their zero position; you can find it in the servo_calibration.py file. With it you can attach each servo horn very close to its zero position.

    Then a more precise calibration is needed: you have to calibrate the offset of each servo. Those offsets are defined in the angleToPulse.py file, which just contains the line mapping each angle to a pulse for every servo.

    Each mapping is a line of the form y = Ax + B. The offset of each servo is given by B, which is the pulse at the zero angle.

    If you are using different servos, the slope of the line (A) will be different, so you will need to identify it. What I did was take 4-5 measurements of the pulse at different angles and put them in a spreadsheet where I calculated the fitted line through those points.

    def convert(FR_angles, FL_angles, BR_angles, BL_angles):
        pulse = np.empty([12])
        #FR
        pulse[0] = int(-10.822 * np.rad2deg(-FR_angles[0])) + 950
        pulse[1] = int(-10.822 * np.rad2deg(FR_angles[1])) + 2280
        pulse[2] = int(10.822 * (np.rad2deg(FR_angles[2]) + 90)) + 1000
        #FL
        pulse[3] = int(10.822 * np.rad2deg(FL_angles[0])) + 1020 
        pulse[4] = int(10.822 * np.rad2deg(FL_angles[1])) + 570
        pulse[5] = int(-10.822 * (np.rad2deg(FL_angles[2]) + 90)) + 1150
        #BR
        pulse[6] = int(10.822 * np.rad2deg(-BR_angles[0])) + 1060 
        pulse[7] = int(-10.822 * np.rad2deg(BR_angles[1])) + 2335 
        pulse[8] = int(10.822 * (np.rad2deg(BR_angles[2]) + 90)) + 1200
        #BL
        pulse[9] = int(-10.822 * np.rad2deg(BL_angles[0])) + 890
        pulse[10] = int(10.822 * np.rad2deg(BL_angles[1])) + 710
        pulse[11] = int(-10.822 * (np.rad2deg(BL_angles[2]) + 90)) + 1050
        return pulse
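The spreadsheet fit described above can equally be done with numpy.polyfit. A minimal sketch with synthetic measurements (real values depend on your servos):

```python
import numpy as np

# Hypothetical calibration measurements for one servo: commanded angle
# (degrees) versus the measured pulse width. Here they are generated from
# a known line so the recovered coefficients are easy to check.
angles_deg = np.array([-45.0, 0.0, 45.0, 90.0])
pulses = -10.822 * angles_deg + 950.0

A, B = np.polyfit(angles_deg, pulses, 1)  # fit pulse = A*angle + B
# A is the slope for this servo model; B is the pulse at the zero angle.
```

With real, noisy measurements the least-squares fit averages out the reading errors, which is exactly what the spreadsheet line did.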

  • A more stable walking.

    Miguel Ayuso Parrilla 07/20/2020 at 11:49 0 comments

    This log is about polishing the gait planner, as you can see in the next video:


    What is interesting in this video is that no control algorithm is running, yet the walk looks very stable.

    This is achieved just by separating the stance phase from the swing phase (as explained in the gaitPlanner.py log). This lets me define an offset between both phases: running the stance phase over 3/4 of the whole step makes the robot hold its weight on all 4 legs for a short period of time, which makes it a bit more stable, just by running the gaitPlanner loop.
    Visually, the gait is defined as follows:

    One additional note: the servos get very hot if the robot runs for more than 10 minutes.
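The phase bookkeeping behind this can be sketched in a few lines (a toy illustration of the idea, not the actual gaitPlanner code):

```python
# Per-leg phase offsets (FR, FL, BR, BL) as in the trot definition, with a
# stance phase lasting 3/4 of the whole step.
OFFSETS = [0.5, 0.0, 0.0, 0.5]
STANCE_FRACTION = 0.75

def legs_in_stance(t, T):
    """Return, for each leg, whether it is in stance at time t (period T)."""
    phases = [((t / T) + off) % 1.0 for off in OFFSETS]
    return [phase < STANCE_FRACTION for phase in phases]

# At the start of the cycle all four legs are briefly on the ground:
print(legs_in_stance(0.0, 0.5))   # [True, True, True, True]
# Later, the diagonal pair FL+BR swings while FR+BL stays planted:
print(legs_in_stance(0.4, 0.5))   # [True, False, False, True]
```

Because stance covers 3/4 of the cycle and the swings are staggered, there are recurring windows where all four feet are on the ground, which is where the extra stability comes from.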

  • Quadruped robot meets my cat.

    Miguel Ayuso Parrilla 07/07/2020 at 18:29 0 comments

    As a quick update on how the V2 robot performs, I have introduced it to my cat.

    My cat is very introverted and usually doesn't like new things; this case is no different, but the robot does seem to spark some curiosity in her.

    The robot performs well, as it runs the latest version of the code. The extra weight of the batteries and Raspberry Pi is not a problem, since the new design is much lighter and the servos are now powered at 6 V instead of 5 V as in the first version.

    On the other hand, I have noticed the model randomly gives bad solutions when running on the Raspberry Pi, which is why you sometimes see a leg make a strange movement. I haven't identified the problem yet; it may be a bug in my software or a buffer overflow.

    There is no aim to hurt the animal, it's my cat, omg.

  • Real robot vs PyBullet simulation.

    Miguel Ayuso Parrilla 06/26/2020 at 14:45 0 comments

    In this video I compare the real robot with the simulation model:

    I'm using the PyBullet physics engine; it is simple to use and seems to deliver good performance as well as very precise results. In the video you can see the simulation is a bit slow; this is because, at the time of the video, the simulation was not running in real time but computing a step every 20 ms.

  • Raspberry Pi communication interfaces.

    Miguel Ayuso Parrilla 06/21/2020 at 21:41 0 comments

    This log is about the Python communication files, explaining the interfaces used to communicate both with the PS3 controller and with the Arduino.

    Serial bidirectional communication in Python.

    This is written in the serial_com.py file and the code goes as follows:

    • First thing is to define the serial port and its baudrate.
    import serial
    import time
    
    class ArduinoSerial: 
        def __init__(self , port):
            #ls -l /dev | grep ACM to identify serial port of the arduino
            self.arduino = serial.Serial(port, 500000, timeout = 1)
            self.arduino.setDTR(False) #force an arduino reset
            time.sleep(1)
            self.arduino.flushInput()
            self.arduino.setDTR(True)
    • Then the communication is done with the next two functions (methods of the class above). These functions are very general and can be used without big changes.

      This program blocks the main loop until the next data arrives; this way the Arduino memory is not overloaded by the arriving pulse commands.

    def serialSend(self, pulse):
        comando = "<{0}#{1}#{2}#{3}#{4}#{5}#{6}#{7}#{8}#{9}#{10}#{11}>" #Input
        command = comando.format(int(pulse[0]), int(pulse[1]), int(pulse[2]),
                                 int(pulse[3]), int(pulse[4]), int(pulse[5]),
                                 int(pulse[6]), int(pulse[7]), int(pulse[8]),
                                 int(pulse[9]), int(pulse[10]), int(pulse[11]))
        self.arduino.write(bytes(command , encoding='utf8'))

    def serialRecive(self):
        try:
            startMarker = 60  # ord('<')
            endMarker = 62    # ord('>')
            getSerialValue = bytes()
            x = "z" # any value that is not an end- or start marker
            byteCount = -1 # to allow for the fact that the last increment will be one too many
            # wait for the start character
            while ord(x) != startMarker:
                x = self.arduino.read()
            # save data until the end marker is found
            while ord(x) != endMarker:
                if ord(x) != startMarker:
                    getSerialValue = getSerialValue + x
                    byteCount += 1
                x = self.arduino.read()
            loopTime , Xacc , Yacc , roll , pitch = numpy.fromstring(getSerialValue.decode('ascii', errors='replace'), sep = '#')
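The '<…#…>' framing used by these two methods can be exercised without any hardware. A round-trip sketch (the helper names here are mine, not from serial_com.py):

```python
def frame_pulses(pulses):
    """Pack 12 pulse widths into the '<p0#p1#...#p11>' frame sent to the Arduino."""
    return "<" + "#".join(str(int(p)) for p in pulses) + ">"

def parse_frame(frame):
    """Recover the '#'-separated values between the '<' and '>' markers."""
    return [float(v) for v in frame.strip("<>").split("#")]

frame = frame_pulses(range(1000, 1012))
values = parse_frame(frame)   # round-trips back to the 12 pulse widths
```

The start and end markers make the stream self-synchronizing: if bytes are lost, the reader simply waits for the next '<' and recovers on the following frame.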
    


    Connecting PS3 controller to the Raspberry Pi and reading its events.

    • Then the events are read in the joystick.py file, using the evdev Python library. In this setup, the events are read every time the read() function is called, ignoring other ongoing events. There is no queueing or filtering either.
    from evdev import InputDevice, categorize, ecodes
    from select import select
    import numpy as np
    
    class Joystick:
        def __init__(self , event):
            #python3 /usr/local/lib/python3.8/dist-packages/evdev/evtest.py for identify event
            self.gamepad = InputDevice(event)
            # state used below (initial values not shown in the original snippet):
            self.T = 0.5                 # step period
            self.compliantMode = False
            self.L3 = np.zeros(2)        # left stick (x, y)
            self.R3 = np.zeros(2)        # right stick (x, y)
    
        def read(self):
            r,w,x = select([self.gamepad.fd], [], [], 0.)
            if r:
                for event in self.gamepad.read():
                    if event.type == ecodes.EV_KEY:
                        if event.value == 1:
                            if event.code == 544:#up arrow
                                self.T += 0.05
                            if event.code == 545:#down arrow
                                self.T -= 0.05
                            if event.code == 308:#square
                                if self.compliantMode == True:
                                    self.compliantMode = False
                                elif self.compliantMode == False:
                                    self.compliantMode = True    
                        else:
                            print("button released")
    
                    elif event.type == ecodes.EV_ABS:
                        absevent = categorize(event)
                        if ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_X":  
                            self.L3[0]=absevent.event.value-127
                        elif ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_Y":
                            self.L3[1]=absevent.event.value-127
                        elif ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_RX":
                            self.R3[0]=absevent.event.value-127
                        elif ecodes.bytype[absevent.event.type][absevent.event.code] == "ABS_RY":
                            self.R3[1]=absevent.event.value-127
    


  • 1
    Building premises.

    I'm going to enumerate some important concepts I have taken into account for the robot design.

    • These robots depend heavily on their mass distribution and on the dynamics of the moving parts (the legs). That's why I chose this servo configuration, with the coxa servo on the body, the femur servo on the coxa and the tibia servo on the femur, so only 2 servos move with each leg.
    • This way, the lower the leg weight, the less unexpected inertia, which gives better dynamic behaviour.
    • Also, the robot's weight should be as low as possible, because hobby servos are easily overloaded.
    • Regarding mass distribution: it defines the Center of Mass, which should be as close as possible to the geometric center. For this, the build must be as symmetric as possible; that way the CoM is easier to locate and the control is more predictable.
    • The length of the leg is set at 100 mm for both tibia and femur; this way, a motor with 20 kg/cm of torque will hold 2 kg at a 10 cm arm. The longer the arm, the less load the motor can support.
    • Avoid plastic-on-plastic joints, as they introduce unnecessary friction.
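The torque bullet above can be checked with a quick calculation (units as in the bullet: servo torque in kg·cm, lever arm in cm; the fully-extended case is my own extrapolation from the 100 mm femur + 100 mm tibia):

```python
def max_load_kg(torque_kg_cm, arm_cm):
    """Static load a servo can hold at the end of a lever arm."""
    return torque_kg_cm / arm_cm

print(max_load_kg(20, 10))  # -> 2.0: the 20 kg/cm servo holds 2 kg at a 10 cm arm
print(max_load_kg(20, 20))  # -> 1.0: with femur + tibia fully extended (200 mm)
```

This is why the legs are kept short and the robot light: doubling the effective arm halves the load each servo can support.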

    Printing premises.

    • This is simple but important to take into account when printing: these parts do not bear much mechanical load, so, following the building premises above, the 3D-printed parts can be printed with ~10-20% infill, 1-2 outer perimeters and 2-3 solid layers.
  • 2
    Building and printing the robot.

    For building this I have made this tutorial, as there are some tricky parts.

    In the Files section, each STL is named with the quantity of parts you need to print; if no quantity is given, just one needs to be printed.

    You can also download all the files at once from the Files section, in STL_files.zip.

    If you are building the robot, take this into account: 

    Some servos have a small ridge around the mounting holes to make that part stronger; for my design I cut these small ridges off in order to make the parts easier to print and design.


    Bolt list:

    • M3 x45 mm: quantity 12
    • M3 x30 mm: quantity 26
    • M3 x25 mm: quantity 16
    • M3 x20 mm: quantity 8
    • M3 x15 mm: quantity 16
    • M3 x7 mm: quantity 4
    • M3 threaded rod x 42 mm: quantity 4
    • M3 threaded rod x 74 mm: quantity 8

    Plus the respective nuts and washers where needed.

  • 3
    Electronics.

    Here is the electronics setup I have:

    You can download the full-size PDF of the scheme from the Files section.

    As you can see, apart from the 4 power lines that feed the servomotors, there is also a voltage divider so the Arduino can read the battery voltage on an analog pin, with 5 LEDs indicating the battery status. The IMU is connected via the I2C interface. The Raspberry Pi is powered by another buck converter at 5 V 3 A, which is missing from the scheme (I will correct this in the next update), and the Arduino communicates with the Raspberry Pi over a USB cable (which also powers it).

    I made two small modular boards to easily connect and disconnect the servos. These boards plug into the big one, which supplies all the pins to the Arduino: the 12 PWM signals with their common ground, plus the I2C interface for the IMU. I can also add sensors (FSRs) here the same way in the future.

    Details of the building:

    • Starting with the small boards that supply the servo power and route the digital signals to the Arduino: they share grounds and are divided into 4 different power lines, one per buck converter. They are connected to the main board, which accommodates the IMU and two of these boards.
    • The other part that accommodates some electronics is the back cover of the robot, where the emergency button is placed along with the 5-LED battery indicator; the voltage divider used to read the battery voltage is also placed here, attached with some hot glue.
    • Everything is then ready to be mounted. Notice that the buck converters sit between the femur servos, attached with tape. You can also see how the Raspberry Pi and Arduino Mega are attached to the top of the robot.
    • Finally, the battery is made of individual cells, so I have to charge them one by one.



Discussions

Alan Churichi wrote 06/20/2020 at 18:04

Hi! Really nice work!

Do you have schematic files for the connections?

Miguel Ayuso Parrilla wrote 06/21/2020 at 21:01

Thanks Alan! I'm working on it; as soon as I solder the new electronics I will post the schematics.

albertson.chris wrote 05/15/2020 at 00:40

Will you post the CAD files? I'd already like to try building just one leg for testing, and I can think of a few changes.

One other question: it looks like the prototype in the video was missing the Pi and batteries. How much weight could be added? Certainly the Pi 4, batteries and the DC/DC regulator will need to be added, but also some sensors.

Miguel Ayuso Parrilla wrote 05/15/2020 at 10:07

Yes! I'm finishing them; I think in the next 3 months I will print all the remaining parts, correct them if there are mistakes and then share them.

Yes, the batteries, the Raspberry Pi and its cables are taken into account in the design; they add about 500-700 g, but I'm also removing weight from the design. Another difference between the prototype and the wireless version is that the cable runs at 5 V and the batteries at 7.4 V.

So I think there will be no problem with this extra weight.

Mustafa wrote 05/14/2020 at 18:11

If you prepare a basic Udemy course, we would buy it.

Mustafa wrote 05/14/2020 at 16:50

Miguel, I am following your project, thanks for all this precious information. Please continue.

Miguel Ayuso Parrilla wrote 05/15/2020 at 10:08

Thanks a lot man! Of course I'll be documenting the whole project.

Comedicles wrote 05/14/2020 at 06:17

Does it have accelerometers and inertial orientation? Put the poor thing out of its misery and place it on a rubber surface so that leg motion can correspond to movement through space. I can tell it is miserable!

Steve wrote 05/13/2020 at 17:23

I loaded the STLs into Ideamaker (a 3D printer slicer) and they show up super tiny, as if they were in inches, not millimeters. What are the dimensions supposed to be if we wanted to print this out?
Thanks

Miguel Ayuso Parrilla wrote 05/13/2020 at 22:40

Those files are not printable; they are only for showing the robot in the simulation. That's why you see them so small: they are measured in meters. If you look closer at them, you'll see there is no way to build or mount them with the hardware.

Steve wrote 05/14/2020 at 00:10

Is there a way to print it out if we wanted to work on one ourselves, or is that not ready yet?

Steve wrote 05/14/2020 at 00:14

Anyway, this is still very cool and I'm looking forward to reading more about it as it progresses.

Alior101 wrote 05/13/2020 at 16:34

Cool! Is this the 180 or 270 PRO version of the DS3218?

Miguel Ayuso Parrilla wrote 05/13/2020 at 17:17

The 180 PRO version; the tibia movement, for example, is limited to 110 degrees.

Alex Tacescu wrote 05/13/2020 at 15:58

Super cool project! Are you calculating forward/inverse kinematics on the Arduino? Or is everything done in the physics engine?

Miguel Ayuso Parrilla wrote 05/13/2020 at 17:21

I'm calculating them in the Python script on the computer/Raspberry Pi. For now I'm not using data from the simulation; I only send the simulation the angles for every joint, the same as on the real robot.
