Quadruped robot with 22 DoF

This is Quadbot 17, a work-in-progress quadruped robot with 22 degrees of freedom, based on Robotis AX-12 motors and a 3D-printed chassis.

This quadruped robot was born out of a learning exercise in Autodesk Fusion 360. The robot hardware is mostly complete, with legs and chassis designed, built and assembled. A Python test program currently solves the inverse kinematics of the leg and base. Some simple walking gaits have been implemented. Future work will focus on finishing the hardware, building more complicated walking routines and mounting a 3D sensor for environmental perception and SLAM.

In addition to trying to give this robot build autonomous behaviour (e.g. SLAM navigation), another objective is to explore various innovative ways of controlling its movement and walking behaviour, and its interaction with the user. The walking and steering motion of the robot can already be controlled via the GUI, keyboard or Xbox One game controller.

So I am currently improving the existing user-controlled functions, as well as implementing some interesting new ones:

  • Fine-tuning the inputs for controlling the walking and steering motion of the robot.
  • Exploring interesting ways that a robot 'tail' can interact with the user.
  • Adding a 3D sensor head to the robot, such as an Intel RealSense depth camera, and visualising the environment.
  • Updating the graphical user interface, based on previous work done with a Qt-based GUI written in C++ and integrated with the USB motor controller as well as the ROS ecosystem.

More details and progress on this project can be found on my blog:

Cognitive Dissonance

FlashPrint project files (.fpp) and 3D printer files (.x3g)

x-zip-compressed - 23.29 MB - 10/14/2018 at 14:22


STL files for custom 3D-printed parts, exported from Autodesk Fusion 360

x-zip-compressed - 5.08 MB - 10/14/2018 at 14:11



Quantities of Robotis robot 3-pin cables required, at different lengths

sheet - 9.77 kB - 10/14/2018 at 14:24



Early-stage bill of materials (from 26/02/2017)

sheet - 9.74 kB - 10/14/2018 at 14:24



Denavit-Hartenberg parameters for the robot's kinematics

sheet - 12.97 kB - 08/26/2018 at 16:29


  • 22 × Dynamixel AX-12A Robot Actuator
  • 2 × Bioloid 6-Port Cable Hub
  • 1 × Robotis OpenCM9.04
  • 1 × Robotis OpenCM 485 Expansion Board
  • 8 × Aluminium sheet, 250 × 250 × 1.5 mm (used for custom aluminium parts)

View all 20 components

  • Depth sensor

    Dimitris Xydas, 10/14/2018 at 15:17

    I have finally made a decision on the sensor and ordered the Intel RealSense D435 Depth Camera! I also found high-quality CAD models of the whole D400 series here: Production CAD Files for Intel RealSense D400 Series.

    After importing it into Fusion 360 and adding a touch of texturing, I added it to the robot model to give a sense of scale. It needs a 3D-printed mount to attach properly to the top of the front body; or better yet, a re-design of the front bumper to integrate the camera inside it, giving an unobstructed forward-facing view.

    The camera will be driven by a PC/laptop for now, but will later need a beefier on-board controller.

  • Forward planning

    Dimitris Xydas, 10/05/2018 at 17:50

    Path generation and following

    The current walking gait system is mostly just a test of the kinematics and user inputs. To make the robot truly manoeuvrable, it needs sensors and better foot positioning techniques. The following diagrams are my initial attempt at deciding exactly how to achieve this. This is a very first look, and the actual implementation will probably differ significantly, but it provides a good starting point for exploring further ideas:

    The problem is broken down into three levels:

    • Level 1: Navigation
      • Scanning of area with 3D sensor
      • Making decisions on how to approach a goal
    • Level 2: Stepping
      • Deciding on appropriate gait parameters
      • Generating the foot target positions within the environment
    • Level 3: Trajectories
      • Planning and executing the foot trajectories
      • Reacting to any obstacles encountered by the feet

    Required hardware updates

    These are the hardware updates that will be needed soon, in order to progress beyond basic walking gaits:

    1) Torsion springs

    Adding torsion springs to the hip/knee motors will reduce the load on these critical joints.

    2) Foot sensors

    This is a must if the robot's feet are to account for rough or uneven terrain and the robot's load distribution. I've been looking at ways of adding foot sensors via the AX-12 daisy-chain rather than separately, to avoid adding multiple cable bundles to each foot. The only existing off-the-shelf option I have found is too expensive (the ROBOTIS OP2-FSR Set). However, the Force Foot design by Rhoban looks like a perfect open-source solution (thanks to B[] for the suggestion!).

    3) 3D sensor

    The ability to get useful information from the environment can be achieved by adding a 3D sensor head to the robot, such as an Intel RealSense depth camera. This will, however, have significant knock-on effects on the rest of the design, such as on power consumption and the need for powerful on-board processing, which a Raspberry Pi won't be able to handle.

    More future ideas

    Better control

    Replacing the motors would only be an option in an entirely new hardware iteration of this robot, as the design is based around the AX-12s. However, I may explore upgrades to the motor control beyond the AX-12s' internal compliance-based controllers. Two potential options:

    • External PID loop wrapped around the existing internal controller
    • Firmware updates such as this one, in order to have direct PID control


    I would like to better understand the dynamics of legged locomotion, and developing a Simulink model would help. Something like this, but on four legs, would really be the ultimate goal!

  • Heads or Tails?

    Dimitris Xydas, 09/16/2018 at 22:49

    The tail has been built!

    The CAD model went through a couple of iterations before I decided on the final form: from eight polygonal sections to six smoother, smaller ones.

    When it came to 3D printing, the base section went through a re-design, since the original idea of just gluing the small section to the base was clearly not going to provide enough stability. I also removed yet another link, to avoid the tail ending up too big and heavy.

    The base section of the tail now screws onto two of the existing holes on the rear base, meaning there is no need to modify it, other than replacing two M3 bolts with longer ones.

    The tail links are all held together with a 150 mm long, 5 mm diameter spring, scavenged from a flexible long reach pick-up tool.

    To make the models easy to 3D print, I sliced them all down the centre, then glued the halves together. Loctite super glue works well with PLA (the gel type works best). There were a few failed prints in the process, which I put down to rushing, and to using PLA that had been out in the air gathering moisture for several months! Prints were made on a Flashforge Creator Pro, with 15% infill.

    Here are some pictures of the printing and assembly progress, leading to the final result:

  • Chassis assembled

    Dimitris Xydas, 09/12/2018 at 22:03

    After 3D printing a few more plastic parts and cutting all the aluminium plates, the custom chassis was finally complete! Below are some notes on the remaining 3D parts and the metal plates.

    More 3D printed parts and painting

    I printed off some of the remaining parts of the chassis. The battery compartment was best printed upright, with minimal support structure needed. The rear bumper was trickier than the front bumper because of the additional hole space for the battery, so I found it was best to print it upright, with a raft and curved supports at the bottom.

    Once all the parts were printed and given some more sanding, I spray-painted them with plastic primer, then blue paint, and finally clear sealer.

    More parts:

    Metal parts

    I was initially thinking of finding an online service to cut out the aluminium chassis parts, but then decided it would be faster and cheaper to just get some 1.5 mm thick aluminium sheets from eBay and cut them on a jigsaw table. I used Fusion 360’s drawing tool to export to PDF the parts I needed to cut out: four chassis plates and four foot plates. I then printed them at actual scale and glued them onto the aluminium to use as cutting templates.


    I threaded the holes on all the 3D parts, which were either 3 mm wide where the aluminium plates attach, or 2 mm at the leg and spine bracket attachment points. Using a tap for the 3 mm holes worked pretty well, but the 2 mm holes were more prone to being stripped or too loose, so manually threading the holes with the bolts worked better. Another issue was the infill surrounding the internal thread cylinder sometimes being a bit too thin. In retrospect, I’d try designing the 3D parts to use heat-set or expandable inserts, especially for the smaller threads.

    The servo brackets attaching to the chassis have a large number of holes (16 for each leg and one of the spine brackets, and 12 for the other spine bracket), so the screws seem to secure the brackets well enough so far. The spine section is under a lot of stress from the weight of the whole chassis and legs, so it may not withstand much twisting force, and the servos may not be strong enough in this area, but I will have to test this in practice with new walking gaits.


    The custom chassis has finally made it from a 3D design to reality, with relative success so far. Some of the threaded holes in the 3D parts are not as strong as I’d like, the AX-12 may be under-powered for the spine connection, and the brackets anchoring the spine may be the first to give way under twisting forces. The chassis as a whole would also benefit from a weight-saving exercise, and could perhaps be thinned down. But this has only been the first iteration of the main chassis, and the robot design has now become a reality and seems to stand up well.

  • From paper to plastic

    Dimitris Xydas, 09/11/2018 at 23:46

    … or more correctly, from CAD to reality, as it is time for 3D printing!

    Initially, before getting my own 3D printer, I used Shapeways to print copies of some of the Robotis plastic brackets (they were hard to source online).

    At one point I also calculated the costs of the custom 3D-printed parts by getting quotes from Shapeways. The basic parts required totalled £1,000! Luckily I ended up getting a FlashForge Creator Pro 2017 soon after, so was able to print the structural parts at home much more cheaply.

    Chassis parts

    After first designing the initial version of each part in Fusion 360, I then made an update pass with some practical changes, such as:

    • adding fillets around the edges
    • decreasing nut hole diameters by 0.2 mm, to provide some material for self-tapping threads
    • increasing the width of some slots by the same amount, to allow a tolerance for their connection to interlocking plastic tabs
    • modifying the rear section to provide more rigidity to the central connection with the spine servo bracket, by adding a 90° tab to the rear underside aluminium base

    Here are some images of the CAD models of the chassis parts:

    Foot Base

    Front body assembly

    Rear body assembly


    All parts were printed in PLA plastic.

    The first part I started with was the foot base. I printed it with a 20% honeycomb infill. I didn’t add any intermediate solid layers, but might do so in other parts.

    Each leg connects to a leg base bracket, which is the same design for all legs. The part was printed “upside-down” because of the orientation of the interlocking tabs. This meant that some support structure was needed for the holes. For the first print attempt I also added supports around the overhang of the filleted edge, along with a brim, but for the subsequent prints I didn’t bother with these, as the fillet overhang held fine without supports, which saved some extra filing/sanding. These parts also used 20% infill.

    For the front and rear “bumpers”, I reduced the infill to 10%.

    For the larger part comprising the central section of the front, the spine front bracket, I also used an infill of 10%. Due to the more complicated design, which would have included many overhangs, I found it easier to cut the part lengthwise and print it as two separate pieces, to be super-glued together after sanding.

    Time-lapse GIFs and images of the printing process:

    Front Bumper

    Spine Front Bracket

    Parts & Assembly

    In terms of printing times, the foot bases and leg base brackets took about 3 hours each, the bumpers took around 4 hours each, and the two spine front bracket halves took about 7 hours combined, so the total printing time was fairly long!

    The 0.2 mm clearance seems to work fine for self-threading the plastic with M2-size metal nuts, but was too large for some of the plastic-to-plastic interlocking tabs, possibly because this tolerance is close to the resolution limits of the printer (theoretically a 0.4 mm nozzle and 0.18 mm layer height). However, after some filing and sanding, all the plastic parts fit together nicely.

    The resulting 3D prints before and after sanding:

    Finally, here are some images of how the chassis assembly shaped up, as well as the foot bases attached to the foot metal brackets. These fitted snugly without any sanding, and all the holes aligned perfectly with the metal brackets, which was reassuring!

    The next step was to glue the front bracket halves together, print the remaining parts, and paint everything.

  • Tails, sensors, user interaction and interfaces

    Dimitris Xydas, 08/26/2018 at 18:08

    In addition to trying to give this robot build autonomous behaviour (e.g. SLAM navigation), another objective is to explore various innovative ways of controlling its movement and walking behaviour, and its interaction with the user. The walking and steering motion of the robot can already be controlled via the GUI, keyboard or Xbox One game controller.

    So I am currently improving the existing user-controlled functions, as well as implementing some interesting new ones:

    Fine-tuning the inputs for controlling the walking and steering motion of the robot

    I have added a new input mode which allows the predefined walking gaits to be scrolled through via the keyboard or controller inputs. In effect, this means the controller can be used to “remote-control” the walking of the robot. The walking gaits still need a lot of tuning, but the basic function is now implemented.

    Exploring interesting ways that a robot 'tail' can interact with the user

    I am currently working on the design of a tail for the robot, as a way of adding a more animal-like quality to the design, as well as a fun way of making the robot convey 'emotion' and other types of feedback. I'm not sure of the exact method of actuation yet; it might not have any! The design will be composed of multiple parts of similar design but decreasing size, resulting in a tapered, snake-like appendage:

    The parts will have a hollow central section, which will allow a flexible material to be threaded through for support. So far the best material I have found is the long spring from a flexible long-reach pick-up tool, which can be found cheaply on eBay:

    Adding a 3D sensor head to the robot

    I originally had two ideas for area scanners which could be the main “eyes” of the robot. One is the Kinect v2, and the other a Scanse Sweep.

    The main advantage of the Sweep is that it is designed specifically for robotics, with a large range and operation at varying light levels. On its own it only scans in a plane by spinning 360°, but it can be turned into a spherical scanner with little effort.

    The Kinect has good resolution and is focused on tracking human subjects, being able to track 6 complete skeletons and 25 joints per person. However, it only works optimally indoors and at short ranges directly in front of it. It is significantly cheaper than the Sweep, but much bulkier.

    However, more recently I have been looking at a third option from Intel's range of depth cameras. The Intel RealSense ZR300 seemed like an ideal choice, but since it has been discontinued, its successors from the D400 series seem like the best choice.

    I rendered the various options on the robot for a size comparison:

    Updating the graphical user interface

    In previous work with a robot that uses the same servos, I created a Qt-based GUI, written in C++ and integrated with the USB motor controller as well as the ROS ecosystem.

    My idea is to re-use many of the components in this interface, in order to improve the interaction with the quadruped. This should be fairly simple, as the GUI was mostly indifferent to the physical configuration of the motors.

    Here are some screenshots showing some of its features:

  • Walking gait smoothing and tuning

    Dimitris Xydas, 08/26/2018 at 14:08

    I added some exponential smoothing to the original walking gaits, to smooth the edges of the trajectories and create a more natural movement.
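    The smoothing here is a standard exponential filter applied to each trajectory channel. As a minimal sketch (the function name and alpha value are my own illustrative choices, not the project's actual code):

```python
def smooth_trajectory(points, alpha=0.3):
    """Exponentially smooth a 1-D trajectory: s_t = alpha*x_t + (1-alpha)*s_{t-1}."""
    if not points:
        return []
    smoothed = [points[0]]  # seed with the first sample
    for x in points[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# A square-edged up/down foot trajectory gets its corners rounded off:
print(smooth_trajectory([0, 0, 1, 1, 1, 0, 0], alpha=0.5))
```

    A lower alpha smooths more heavily, at the cost of lagging further behind the original trajectory.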

    I then added a ±30° pitching motion to what best approximates the ‘ankle’ (4th joint), to emulate the heel off and heel strike phases of the gait cycle.

    The range of motion of the right ankle joint. Source: Clinical Gate.

    I realised however that applying the pitch to the foot target is not exactly the same as applying the pitch to the ankle joint. This is because, in the robot’s case, the ‘foot target’ is located at the centre of the lowest part of the foot, which comes into contact with the ground, whereas the ankle is higher up, at a distance of two link lengths (in terms of the kinematics, that’s a4+a5). The walking gait thus does not end up producing the expected result (this is best explained by the animations at the end).

    To account for this, I simply had to adjust the forward/backward and up/down position of the target, in response to the required pitch.

    With some simple trigonometry, the fwd/back motion is adjusted by -A*sin(pitch), while the up/down motion is adjusted by +A*(1-cos(pitch)), where A is the distance a4+a5 noted previously.
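    As a minimal sketch of this adjustment (function and parameter names are mine, not the project's code; A stands for the a4+a5 distance):

```python
import math

def adjust_target_for_pitch(fwd, up, pitch_deg, A):
    """Shift a foot target so the pitch is effectively applied about the
    ankle joint rather than the ground-contact point (A = a4 + a5)."""
    pitch = math.radians(pitch_deg)
    fwd_adj = fwd - A * math.sin(pitch)        # fwd/back: -A*sin(pitch)
    up_adj = up + A * (1.0 - math.cos(pitch))  # up/down:  +A*(1-cos(pitch))
    return fwd_adj, up_adj

# Zero pitch leaves the target unchanged:
print(adjust_target_for_pitch(0.0, 0.0, 0.0, A=50.0))  # (0.0, 0.0)
```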

    Here are the details showing how the targets are adjusted for one particular phase in the creep walk gait (numbers involving translations are all normalised because of the way the gait code currently works):

    Creep walk gait, 25-step window where the foot target pitch oscillates between ±30°. Original target translations (fwd/back, up/down) are shown compared to the ones adjusted in response to the pitch.

    The final results of the foot pitching, with and without smoothing, are shown below:

    Original – Creep – Unsmoothed
    Adjusted – Creep – Unsmoothed

    Original – Creep – Smoothed
    Adjusted – Creep – Smoothed

    Original – Walk – Unsmoothed
    Adjusted – Walk – Unsmoothed

    Original – Walk – Smoothed
    Adjusted – Walk – Smoothed


  • Steering input adjustments

    Dimitris Xydas, 08/26/2018 at 13:58

    Following on from the previous post on walking and steering, I realised that when moving the spine joints, the rear feet remain anchored to the ground, when it would be better if they rotated around the spine motors, to give a better turning circle for steering.

    The feet remain fixed because their targets are defined in world coordinate space, so moving the spine won’t change the targets.

    There are advantages to defining the targets in world space for future work, when the robot knows more about its environment. For example, the legs can be positioned in the world in order to navigate upcoming terrain or obstacles. But for now, it is often useful to work in coordinates local to the base (front base for the front legs, rear base for the rear legs), since this way you don’t have to worry about the relative positioning of the front base w.r.t. the rear.
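    The world-vs-local distinction boils down to a frame conversion. As a rough illustration, simplified to a planar (x, y, yaw) base pose (the names and the 2-D simplification are my assumptions, not the actual kinematics code):

```python
import math

def world_to_local(target_xy, base_xy, base_yaw):
    """Express a world-space target point in the base's local frame."""
    dx = target_xy[0] - base_xy[0]
    dy = target_xy[1] - base_xy[1]
    # Rotate the offset by the inverse of the base orientation.
    c, s = math.cos(-base_yaw), math.sin(-base_yaw)
    return (c * dx - s * dy, s * dx + c * dy)

# A target one unit ahead of a base yawed 90 degrees maps to roughly (0, -1):
print(world_to_local((1.0, 0.0), (0.0, 0.0), math.pi / 2))
```

    A target expressed this way stays attached to the base when the base (or spine) moves, rather than staying anchored in the world.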

    I will eventually update the kinematics code so either world or local targets can be selected.

    For now however, I have made an update to the code, so if the spine joint sliders, gaits or walking/steering inputs are used, the rear leg targets move with the spine. To explain this better visually:



    Another minor adjustment you might notice was the widening of the stance, to provide a larger support polygon. The walking gaits still need fine-tuning, as walking on the actual robot is still unstable and slow.
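    The point of a larger support polygon is static stability: the robot is statically stable while its centre of mass, projected onto the ground, stays inside the polygon formed by the grounded feet. A simple ray-casting containment test sketches the idea (illustrative only, not the project's code):

```python
def com_inside_support(com_xy, foot_points):
    """True if com_xy lies inside the polygon of grounded foot positions."""
    x, y = com_xy
    inside = False
    n = len(foot_points)
    for i in range(n):
        x1, y1 = foot_points[i]
        x2, y2 = foot_points[(i + 1) % n]
        # Toggle on each polygon edge crossed by a rightward ray from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```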

  • Walking and steering inputs

    Dimitris Xydas, 08/26/2018 at 13:56

    Further progress on steering input controls:

    First, I have updated the walking gaits with additional target values. Second, I have added a new input mode which allows the predefined walking gaits to be scrolled through via the keyboard or controller inputs. In effect, this means the controller can be used to “remote-control” the walking of the robot! The walking gaits still need a lot of tuning, but the basic function is now implemented.

    I have updated the CSV spreadsheet for gait data, so that it now includes the 5 possible degrees-of-freedom of each foot (XYZ and Roll/Pitch), the 6 DoF of the base, and the 2 spine joints.

    The walking gait’s updated list of foot target values (first 50 out of 100).

    The foot target values visualised (base and spine joints not shown).

    In Python, all the CSV data is loaded into an array. One of the keyboard/controller inputs can now also be used to update an index that scrolls forwards/backwards through the array’s rows.

    Next, to get the robot to turn, a second input controls a deflection value which adjusts one of the spine joints and the base orientation (as was mentioned in a past post). The deflection slowly decreases back to 0, if the input is also 0.
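    Roughly, the two inputs work like this sketch (class and method names, the wrap-around behaviour and the decay constant are my own illustrative choices, not the project's code):

```python
class GaitController:
    def __init__(self, gait_rows, decay=0.9):
        self.rows = gait_rows   # e.g. the rows loaded from the gait CSV
        self.index = 0          # position within the gait cycle
        self.deflection = 0.0   # steering deflection applied to the spine
        self.decay = decay

    def scroll(self, direction):
        """Step forwards (+1) or backwards (-1) through the gait cycle."""
        self.index = (self.index + direction) % len(self.rows)
        return self.rows[self.index]

    def steer(self, input_value, gain=0.1):
        """Accumulate steering input; decay back towards 0 when input is 0."""
        if input_value != 0.0:
            self.deflection += gain * input_value
        else:
            self.deflection *= self.decay
        return self.deflection
```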

    By doing this, the walking gait can be controlled at will by the two inputs, and hopefully make the robot walk and turn. Next comes the fine-tuning and testing!

    All the latest code can be found on the Quadbot17 GitHub project page as usual.

  • Body Moving

    Dimitris Xydas, 08/26/2018 at 13:53

    I have been moving the robot’s base around in the world, while keeping the legs fixed to the ground, as a test of the robot’s stability and flexibility.

    The robot base can now be controlled, either via the GUI, keyboard or gamepad, in the following ways:

    • Translation in XYZ
    • Roll/pitch/yaw
    • Movement of the two spine joints – Front of robot remains still, while rear adjusts
    • Movement of the two spine joints – Front of robot attempts to counteract the motion of the rear

    You may notice that the real robot can’t move its upper leg all the way to horizontal, as the IK might suggest is possible, because of a small clash between the AX-12 and the metal bracket; this should be fixable by filing or bending the curved metal tabs:

    Software updates

    I have recently written an OpenCM sketch to control the robot servos, in a way similar to how it was being done with the older Arbotix-M, but this time using the Robotis libraries for communicating with the motors.

    I have also been making various updates to the Python test code, with the main changes being:

    • Improved the code for positioning the base and base target in world
    • Updated base/spine transforms – Front legs now move with base, not first spine joint
    • Fixed the leg IK – Legs now remain in line with ground when the base moves
    • Added new keyboard/joystick input modes for controlling base position, base orientation, spine joints
    • Updated the serial string sending function and fixed some minor issues
    • Moved a load of script configuration variables to a separate Params module
    • Added a combo box to the GUI front-end for loading a selection of CSV files (as an update to the previous two fixed buttons)

    All the latest code can be found on the Quadbot17 GitHub project page as usual.

View all 11 project logs

  • 2: Hardware Architecture
  • 3: Kinematics

    The kinematics are written in Python/TKInter, and code can be found on GitHub.

    The geometrical drawings were made using the browser-based GeoGebra app.
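    Since the kinematics are driven by the Denavit-Hartenberg parameters listed in the files section, here is a generic, dependency-free sketch of how one DH row (theta, d, a, alpha) maps to a homogeneous transform, using the standard DH convention (not the project's actual code):

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Homogeneous transform for one link, standard DH convention."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product, used to chain link transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Two links with both joint angles at 0 stretch out along x: end at a1 + a2.
T = mat_mul(dh_matrix(0, 0, 1.0, 0), dh_matrix(0, 0, 0.5, 0))
print(T[0][3])  # 1.5
```

    Chaining one such matrix per joint, from the base out to the foot, gives the forward kinematics of a leg.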


    Trigonometry based on front view

    Trigonometry based on side view
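    The side-view trigonometry reduces to a standard two-link planar IK problem. A minimal knee-down solution, with illustrative names and link lengths (a sketch, not the project's actual code):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar 2-link IK: return (hip, knee) angles in radians for target (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(c2)  # knee-down solution
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# Fully stretched along x: both angles come out as 0.
print(two_link_ik(2.0, 0.0, 1.0, 1.0))
```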

    Evolution of the Kinematics

    Elliptical path test
    Quad kinematics gait test
    Gait foot pitch test original
    Gait foot pitch test adjusted

View all 7 instructions


Jon Hylands wrote 01/12/2021 at 21:17 point

Hey Dimitris, not sure if you're still working on this, but I have some boards I used to sell for FSR sensors, using an ATmega168. I just got a new set made up for my bioloid quad Roz, but I'll gladly share the PCB files and the firmware if you want them. They are bus-based boards, and plug into the servo bus.


stefane.lemay wrote 01/24/2019 at 20:28 point

A quick comment to let you know your project is awesome and very inspiring!  I'm currently working on a bipedal robot, which I aim to be a walker (dynamic gait, no bent knees).

Since I'm on a budget, I went for the inexpensive xl-320 and opencm 9.04.  I like your usage of the PI along with the opencm.  I'm trying to achieve something like it as well, as I wanted a more powerful platform to do the data processing.

Continue your good work!


B[] wrote 09/12/2018 at 06:37 point

Very awesome project! I was thinking about doing something like this myself, I'm glad there is already somebody making awesome progress!

From watching the twitching of the motors in the video, I would recommend sending motor updates at a higher rate (the Robotis motors should support a baud of 1M, which is about 100-200 updates per second). You can get the timing even better if you do bulk reads. Also if you set your PID right you can aim for future positions - updates should be time based (plot a curve for the motor position and calculate where you expect it to be for a given time).

Another suggestion is if you end up with load issues on the motors and want to keep costs down, try putting a spring or elastic to support the weight of the high load joints (typically knees). The AX12s are only good for about 12kg/cm (at optimal voltage) so if you put more on board you might run into issues.

Just as a hint, you can get very nice feedback from the motors (at least the MX32-64 series), such as load, whether the target angle has been reached, etc. That would certainly give useful information regarding balancing your walking algorithm.

Also, it's worth checking out the SM30s of the Feetech smart motors if you're looking for a low cost similar smart motor.

P.S. My background is humanoid robotics, so in some ways many of these issues are magnified by only having two support legs and being further from the ground!


Dimitris Xydas wrote 09/26/2018 at 21:07 point

Thanks for the tips B[] !

Yes, the AX-12 supports a 1M baud rate, however the real command speed is limited (there's a good article about this). At the moment I am only sending motor position commands as fast as possible, so I will be making some improvements.

I've also been meaning to look at torsion springs for the hip/knee motors.

AX-12 'torque' feedback is actually just the target-actual error value, so not a true torque sensor. You can however play around with the compliance settings.

I've also been looking at ways of adding foot sensors via the AX-12 daisy-chain and not separately, to avoid adding multiple cable bundles, but so far the only existing option I have found is too expensive (the ROBOTIS OP2-FSR Set).

The SM30s look interesting, plus Dynamixel has some newer X servos which are an improvement to the AXs, but I think I'm too far into this project to change motors now (maybe in v2?).


B[] wrote 09/27/2018 at 14:34 point

With the 1M baud-rate, from experience you should be able to update the motors > 100 times a second (this was for dynamixel 20 motors).

Torsion springs do work; just be aware that the walk algorithm might have to change, as you may currently be compensating in it for motors that can't reach their target value.

The target-actual error gives "some" information about the load a motor is under. If you know the weight of the robot, the target value, COM and momentum, you can begin to approximate torque pretty well.

Regarding the foot sensors, take a look at the work done by Rhoban. These guys decided to implement their own foot sensors, and they say they actually work better than the Darwin ones. They are some friendly French guys; don't be afraid to reach out to them.

The SM30s were just a suggestion to keep costs low - if money isn't an issue then by all means, Robotis motors are very good.

And yes, you should create a V2 wish list! Make sure your V1 actually reaches a point of being finished!


Dimitris Xydas wrote 09/28/2018 at 17:38 point

Thanks for the further tips, the Rhoban sensors look ideal! I will definitely look into it.


ActualDragon wrote 07/20/2018 at 01:27 point

wow, this is awesome! Just went camping and had an idea. Wouldn't it be funny af to disguise it as a bear and put a camera in the eyes? Then you could walk it around a campground and tip over stuff. XD Might have to be a future project of mine, now if only I was a halfway decent hunter...



Dan DWRobotics wrote 05/12/2018 at 20:00 point

This is extremely cool looking. I imagine it will be quite difficult to program all of the joints, but I'm looking forward to seeing it walking around and being generally badass. Looks very mech; I can imagine an enlarged version carrying supplies on an off-world mining colony. Have you thought about putting a laser on it somewhere? Just because?


Dimitris Xydas wrote 05/13/2018 at 15:13 point

Thanks! The difficult part is now creating useful trajectories for the walking gaits, and making walking adaptable to terrain, not just a set sequence.

Laser will hopefully be in the form of a laser scanner. Might also give it a tail!


deʃhipu wrote 05/12/2018 at 19:13 point

Wow, a really nice robot! I think it's the first quadruped I've seen that has flat feet — are the two extra servos worth it? I can't wait to see how the spine movement adds to the capabilities of the robot.


Dimitris Xydas wrote 05/13/2018 at 15:32 point

Thanks! Most quad robots seem to have point contact feet, I thought I would try adding extra DoF for better foot control. I have also considered replacing the flat base plates with narrower, curved feet if the flat bases prove too restrictive to movement. I'll see how it goes!

