Blackbird Bipedal Robot

A low-cost, high-performance bipedal walking robot

Blackbird is an open-source, low-cost bipedal robot capable of high-resolution force control. It is a research and education platform designed for college and post-grad students interested in studying the advanced field of robotics. Using a novel custom actuator design and supporting software, the system matches the control requirements of modern research robots for a fraction of the cost by replacing expensive hardware with electrical and control solutions. We have implemented multiple algorithms, based on modern research methods, that allow the platform to walk, run, spin, and jump. Most of the design is 3D printed, including the actuators, which allows it to be easily manufactured by students and enthusiasts.


Blackbird is a new bipedal robot that specializes in high-bandwidth force control and is capable of meeting many of the expected demands for legged robots. As a result of its motor design, the robot can use virtual control models to replicate the compliant properties of springs, dampers, and torque-sensing hardware without any extra cost (just math). The reduced mechanical complexity gives the robot a minimal design capable of high-quality locomotion, perfect for research labs, college classrooms, and rapid prototyping in industry.


We designed the OpenTorque Actuator specifically for high-torque scenarios that rely on high backdrivability, a key criterion for effective legged locomotion. Rather than relying on expensive sensors to measure motor torque, the system uses its quasi-direct-drive nature to produce high-resolution torque estimates. This proprioception keeps the price of the actuator low without sacrificing torque-sensing capability, and also boosts the effective bandwidth of the actuator (allowing it to react faster).
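The idea can be sketched in a few lines: with a low gear ratio, reflected friction barely masks the external load, so joint torque can be estimated directly from measured motor current. All constants below are illustrative assumptions, not OpenTorque's actual parameters.

```python
# Hedged sketch of proprioceptive torque estimation in a quasi-direct-drive
# actuator. All constants are illustrative assumptions, not OpenTorque's
# actual parameters.

KT = 0.083    # motor torque constant [N*m/A] (assumed)
GEAR = 8.0    # gear ratio (assumed; QDD designs keep this low)
EFF = 0.9     # transmission efficiency (assumed)

def estimate_output_torque(current_a: float) -> float:
    """Estimate joint torque from measured motor phase current.

    With a low-ratio transmission the current is a good proxy for output
    torque, so no dedicated torque sensor is needed.
    """
    return KT * current_a * GEAR * EFF

print(estimate_output_torque(10.0))  # roughly 5.98 N*m
```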

Regarding the walking problem, we have developed two walking controllers for the robot. The first is a reduced-dynamics model based on the ATRIAS and CASSIE passive dynamics. Mathematically, this control method works well in environments with stairs and other unstable terrain. The second controller is a Model Predictive Controller (MPC) that can be used for more complicated maneuvers, such as jumping and walking while spinning. MPC relies on online optimization techniques to find the best force distribution that meets the desired user commands.

Blackbird stands 1.2 meters tall and weighs roughly 15 kg. Its estimated operation time is 2.5 hours with a 400 Wh battery. The total BoM cost is less than $3000, which is significantly less than any human-sized bipedal robot capable of high-quality force control. For reference, a single HEBI X-Series Actuator, which uses Series Elastic technology, is about $3000 and is close to equivalent in force-sensing capability.


OpenTorque has a control bandwidth of approximately 30 Hz (3x-4x greater than that of a human), higher than that of many legged robots in the research community. For comparison, the actuators on some of the best quadruped robots, ANYmal (SEA), HyQ (hydraulic), and StarlETH (SEA), have maximum force-control bandwidths of 60 Hz, 20 Hz, and 25 Hz respectively (higher is better), but these are exclusive research tools that cost $X0,000+.

Each of Blackbird's legs has 5 Degrees of Freedom (DOF), totaling 10 actuated DOF for walking. The system is capable of walking, running, spinning, and jumping as a direct result of the actuation hardware and control schemes used. We have also verified its ability to operate in common human environments by executing maneuvers such as climbing stairs. Testing shows that the robot can maintain its balance even after receiving a 20-pound (lbf) kick to the body.


Since the robot is far less expensive than industrial platforms, we hope that it can be used by students, researchers, and professionals interested in legged robot development to learn, test, and verify different control schemes or software stacks. We believe that hobbyists and researchers can hugely benefit from access to a low-cost, highly capable biped platform like Blackbird, especially since no other bipedal robot on the market offers equivalent force control per dollar. Locomotion is first and foremost a design and control problem long before it is a vision or AI problem. We hope that Blackbird helps bridge the locomotion gap by providing an effective platform for engineers and computer scientists.

We have also begun working on Blackbird 2, which will use a new patent-pending motor design and more robust materials (injection-molded plastic and aluminum) for a stronger frame...



CAD of the whole assembly for 3D printing

step - 23.35 MB - 09/30/2019 at 09:07


Blackbird URDF model

Zip Archive - 4.68 MB - 09/30/2019 at 08:08



  • Blackbird 2 CAD Sketching

    Nathan, 09/27/2019 at 09:16

    With the end of the year coming around, we thought it would be fitting to share some CAD outlines for a new Blackbird model. While we haven't officially started working on Blackbird 2, we wanted to show some very rough outlines for the future designs being considered. Blackbird 1 was nice because we put a real focus on 3D-printed parts, modularity, and commercial off-the-shelf (COTS) products.

    Blackbird 2 will be a far more customized solution, pursuing a "Design for Manufacturability" approach. It will be composed of easily accessible and manufacturable materials such as molded plastics, carbon fiber, and aluminum. We are also currently working on an entirely custom backdrivable pancake motor that will deliver extremely high torque and repeatability (and will outperform OpenTorque in almost every way for a small price increase). These new motors will be the bread and butter of the new design, allowing it to do highly dynamic maneuvers like jumping. Since control is the heart and soul of locomotion, the new design will focus on lowering the leg inertia and moving the center of mass to a more controllable position.

    The open hip design (above) is probably our leading candidate at the moment due to its flexibility and high degrees of freedom. By keeping an open design, each leg has an insane ab/ad range and can yaw a full 360 degrees if desired. The design stacks all three of its leg motors (hip, knee, and foot) on the hip of the robot and will rely on wiring/belts to transmit torque to the appropriate joints. This reduces the inertia of the leg and allows for better leg trajectory tracking. One downside of the design is the location of the center of mass, which will be higher from the ground because the legs are mounted underneath. Depending on the strength and weight of the legs, the simple block of support above the motors may not be enough to prevent the body from twisting/bending.

    A second, closed hip design was inspired by the design of Atlas by Boston Dynamics. It's essentially the same as the open hip design, except with a fully enclosed body to help support the motors. The center of mass will be lower, since the battery could be mounted lower on the body, which is better for balance. The hips are mounted off-axis from the yaw joint, which gives the foot an arched trajectory while yawing. This may be an advantage because it could contribute to the stride phase while walking and accelerate the body faster. The ab/ad range is smaller compared to the open hip design, reaching only roughly ±60 degrees before making contact with the body. The thought was that we could also integrate small motors into the foot to handle its smaller loads. If the integrated motor design doesn't work out, the motor can be stacked at the hip and its torque transferred to the foot in a similar fashion to the open hip design.

    The next major design steps are going to be refining the transmission system in the legs and conducting FEA on the joints/body. Since there may be some material and linkage-based nonlinearities in the free body diagrams, a fair bit of the analysis will come down to physical testing.

  • A New Optimal Controller

    Nathan, 09/22/2019 at 05:47

    What's better than one way to control a biped? That's right, two ways to control it... and a longish log post to summarize the top level of the new one. 

    We've recently created an optimal Model Predictive Controller (MPC) that can also be used to control Blackbird. Now the robot can walk compliantly using SLIP dynamics or floating-body dynamics. These two options have led us to some interesting results, and while I can't compare them at the moment, I wanted to show off the new controller we developed. This log will cover the super high level concepts of the control system. That all being said: what did we do, why did we do it, and how did we do it? Here are some basic sim result videos, but we are holding off on showing the physical robot walking with this controller for now. There are still some little kinks to work out.

    We did some disturbance testing and basic walking tests in order to verify that the controller works well. You can watch the video at the top of the log to see what they looked like visually. Basic walking testing was done to make sure the controller worked at reasonable speeds (Fig 1). The second major test we did was exerting a 100 N force on the robot from the front to make sure it would keep its balance (Fig 2). You can see that the robot is able to recover after about 1 meter of travel, but does not fall over.

    Fig 1. Body following a desired speed.

    Fig. 2 Response to a force at the body.

    One of the main inspirations that brought this project into being was the phenomenal work by Jonathan Hurst, Christian Hubicki, and Mikhail Jones on the CASSIE and ATRIAS robots. They helped pioneer modern compliant locomotion using reduced SLIP models for bipeds to mimic human dynamics. Offshoots of this research have led to some interesting results using hybrid and optimal solutions from Michigan Robotics and the AMBER Lab at Caltech, respectively.

    On the other side of the spectrum you have incredibly dynamic multi-legged robots such as the MIT Cheetah 3/Mini (Sangbae Kim, Ben Katz, Gerardo Bledt, and Jared Di Carlo) or ETH Zurich's ANYmal (by ANYbotics). We've also had some long chats with the folks over at Boston Dynamics about how they use Model Predictive Control for both Atlas and SpotMini; they have been developing new formulations for simple and effective multi-layered Model Predictive Control with computed torque control.

    So first of all: what is Model Predictive Control? For those that don't know (I wouldn't be surprised, because this is graduate-level math), MPC is a form of optimal control where you basically tie an optimizer to a robust closed-loop control system to determine your outputs. We love it because it can control your plant (robot) really, really well if you can characterize its dynamics properly.


    We pose the dynamics of the robot in two stages. The body of the robot just acts as a floating mass a distance from the ground. The MPC is only responsible for optimizing the state of the body mass in space based on the force each foot exerts on the ground. Basically, the mathematical model boils down to this: the feet are at a specific place based on the walking gait and have to exert a force that keeps the body balanced (counteracting all that inertial force from the body) and moving along the desired body trajectory. Given the forces that the optimizer determines the feet have to exert, you can do the Jacobian math to transform each foot force into leg joint torques. Note that this prediction only works if your legs are light and have little inertia, because we just abstract them away to simplify the model.
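The Jacobian step above can be sketched for an illustrative planar two-link leg (not Blackbird's actual kinematics): the Jacobian transpose turns the optimizer's desired foot force into hip and knee torques.

```python
# Hedged sketch: mapping a desired foot force into joint torques via
# tau = J^T * f, for an illustrative planar 2-link leg (made-up link
# lengths and sign conventions, not Blackbird's geometry).
import math

def leg_jacobian(q_hip, q_knee, l1=0.4, l2=0.4):
    """Jacobian of foot position (x, z) w.r.t. hip and knee angles."""
    s1, c1 = math.sin(q_hip), math.cos(q_hip)
    s12, c12 = math.sin(q_hip + q_knee), math.cos(q_hip + q_knee)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def foot_force_to_torques(q_hip, q_knee, fx, fz):
    """tau = J^T f: joint torques that realize a desired foot force."""
    J = leg_jacobian(q_hip, q_knee)
    tau_hip = J[0][0] * fx + J[1][0] * fz
    tau_knee = J[0][1] * fx + J[1][1] * fz
    return tau_hip, tau_knee

# With the leg straightened (q = 0), a vertical 100 N foot force loads the
# hip and knee through lever arms of l1+l2 and l2 respectively.
print(foot_force_to_torques(0.0, 0.0, 0.0, 100.0))
```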

    The second layer is the leg swing dynamics. Hooray for more force control! The MPC is only used when the legs are on the ground. When the legs are in swing, we predict the leg dynamics (feedforward PID control) using inverse dynamics in operational...


  • Testing the trajectory tracking of the legs

    Nathan, 09/22/2019 at 04:05

    It has been a few months since there has been an update for this page. In that time there has been a ton of work finalizing the physical robot and improving the overall control code. Now that everything is wired up on the robot, we can start looking at the tracking performance of the legs during the swing phase of walking. We looked at a few different methods for generating trajectories. In the video you get a preview of it walking in the air while we were testing the trajectories.

    We decided to settle on a stable polynomial spline trajectory. There are many types of splines out there, such as cubic, quintic, clothoids, etc., but a simple cubic spline was the best fit because we wanted basic acceleration motion profiling for the inverse kinematics and dynamics of the leg. Below is an example of a basic cubic spline interpolating a few data points (this is not the actual trajectory of the foot, just an example).

    For those that don't know, cubic splines (red) have pretty bad position tracking if the distances between knots (X's) are large. Our trajectories are calculated online based on the swing distance needed to walk at the desired speed, so cubic spline inaccuracies shouldn't be an issue because we can generate plenty of knots to interpolate during the swing trajectory.
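As a concrete illustration of knot-based cubic interpolation, here is a minimal Catmull-Rom (cubic Hermite) spline. The knot values are made-up foot heights, and the real controller may use a different spline formulation; this is only a sketch of the idea.

```python
# Hedged sketch: evaluating a piecewise-cubic (Catmull-Rom) spline through
# swing-trajectory knots. Knot values are illustrative, not the actual
# foot trajectory.

def catmull_rom(points, t):
    """Evaluate a Catmull-Rom spline through `points` at t in [0, len-1]."""
    n = len(points)
    i = min(int(t), n - 2)           # segment index
    u = t - i                        # local parameter in [0, 1]
    p0 = points[max(i - 1, 0)]
    p1, p2 = points[i], points[i + 1]
    p3 = points[min(i + 2, n - 1)]
    m1 = 0.5 * (p2 - p0)             # finite-difference tangent at p1
    m2 = 0.5 * (p3 - p1)             # finite-difference tangent at p2
    h00 = 2*u**3 - 3*u**2 + 1        # cubic Hermite basis functions
    h10 = u**3 - 2*u**2 + u
    h01 = -2*u**3 + 3*u**2
    h11 = u**3 - u**2
    return h00*p1 + h10*m1 + h01*p2 + h11*m2

# Foot height over the swing phase: lift, peak, touch down (illustrative).
knots = [0.0, 0.06, 0.10, 0.06, 0.0]
print([round(catmull_rom(knots, 0.5 * k), 3) for k in range(9)])
```

The spline passes through every knot exactly, so generating dense knots keeps the tracking error small even with simple cubic segments.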

    Since the robot also has a concept of force control, the trajectory only acts as a soft guideline for the feet. If an object interferes with the foot before expected, the proprioception from the QDDs will notice, allowing the robot to preemptively finish its trajectory and act compliantly with the object disturbing it.

  • Mechanical build

    Gabrael Levine, 04/30/2019 at 06:10

    Over the last few weeks, I've been iterating on the design and making parts. The Blackbird is now close to being ready to walk. 

    The few remaining challenges to be solved have to do with the electronics, mainly wiring up the Raspberry Pi and synchronizing all the motors over CAN bus. I've chosen to use five MCP2515 CAN boards connected to the Raspberry Pi over SPI. Each of the five CAN channels will connect to a single ODrive board, ensuring the 1000 Hz control frequency required for closed-loop torque control. 
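As a sketch of what one torque command looks like on the wire: the ODrive CANSimple protocol, as we read it, forms the 11-bit arbitration ID as (node_id << 5) | command_id and packs Set_Input_Torque as a little-endian float32. The command number and layout below should be verified against the ODrive docs for your firmware version; actually sending the frame would go through SocketCAN (e.g. via python-can) once the MCP2515 kernel driver exposes a can0 interface.

```python
# Hedged sketch: building a Set_Input_Torque CAN frame for one ODrive axis.
# Command number and payload layout follow our reading of the ODrive
# CANSimple protocol; verify against the docs for your firmware version.
import struct

SET_INPUT_TORQUE = 0x0E   # CANSimple command id (assumed; check your docs)

def torque_frame(node_id: int, torque_nm: float):
    """Return (arbitration_id, payload) for a torque command to one axis."""
    arb_id = (node_id << 5) | SET_INPUT_TORQUE   # 11-bit ID: node + command
    payload = struct.pack('<f', torque_nm)       # little-endian float32
    return arb_id, payload

arb, data = torque_frame(3, 1.5)
print(hex(arb), data.hex())  # frame destined for axis/node 3
```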

    Special thanks to ODrive Robotics for sponsoring this project. 

    More updates coming once it walks (hopefully within the next couple weeks!)

  • More controller improvements

    Gabrael Levine, 01/04/2019 at 07:06

    The Blackbird has a couple new tricks. It can now turn around and walk over stairs. 

    I still have more features to add (for example, the ability to stand still), but once that's done I'll publish the source code for the controller. 

  • Balance Control

    Gabrael Levine, 12/17/2018 at 23:55

    I've implemented PD balancing control for the torso. Previously there were prismatic constraints keeping the torso parallel to the ground, but now the robot is entirely self-balancing. 

    To prevent the feet from slipping, the controller limits the balancing torque according to the current axial force on the leg. When the axial leg force is small (such as immediately after touchdown or right before liftoff), the controller knows not to apply too much balancing torque to that leg. 

    Conversely, during the double support phase (both legs on the ground), the controller evenly distributes the balancing torque between both legs. This keeps it from simultaneously applying full balancing torque to both legs, which would cause the robot to "overbalance" and fall over. 
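A minimal sketch of this distribution logic, with a made-up clamp constant: each stance leg gets a load-proportional share of the desired balancing torque, capped by a slip-avoidance limit tied to its axial force.

```python
# Hedged sketch of the balancing-torque distribution described above.
# The torque-per-axial-force limit is an assumed, illustrative constant.

MU_EFF = 0.4   # max balancing torque per newton of axial leg force (assumed)

def distribute_balance_torque(tau_desired, axial_forces):
    """Split a desired torso torque across the legs in contact.

    axial_forces: axial force [N] per leg (0 for a leg in swing).
    Each leg gets a load-proportional share (an even split in double
    support with equal loading), clamped so a lightly loaded leg, such as
    just after touchdown or right before liftoff, is never asked for much.
    """
    total = sum(axial_forces)
    if total <= 0.0:
        return [0.0 for _ in axial_forces]
    taus = []
    for f in axial_forces:
        share = tau_desired * f / total   # load-proportional share
        limit = MU_EFF * f                # slip-avoidance clamp
        taus.append(max(-limit, min(limit, share)))
    return taus
```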

  • Walking control part 3

    Gabrael Levine, 12/07/2018 at 05:07

    I've added the full-order Blackbird model to the PyBullet simulation. 

    It's running the same SLIP controller as before, but the Blackbird's torso and feet are constrained to the positions of the SLIP walker's torso and feet. 

    This means the assumptions of the SLIP model are no longer true (the legs are no longer massless), yet it performs admirably with no modifications. I believe this proves something about the inherent stability of my robot/controller design. 

  • Walking control part 2

    Gabrael Levine, 12/06/2018 at 01:08

    Here you can see the improved controller in action. It's able to walk in 3D (in this case following a circular path) and reject large disturbances. Still to be added: torso balancing, transitions between walking/standing, and yaw control. 

  • Walking Control

    Gabrael Levine, 11/25/2018 at 04:31

    Prior to building the robot, I'm developing the controls on a reduced-order SLIP model. The robot's legs are modeled as massless, springy linear actuators on revolute joints. All the mass is located at the hip. 

    The controller here was inspired by this paper

    This controller has two basic parts:

    State-based control: Switches each leg between the stance and swing controllers according to whether the foot is in contact with the ground. 

    Time-based control: Lifts the legs and performs energy injection according to a repeating timer. The timers for the two legs are 180 degrees out of phase with one another.

    The stance controller holds the leg at a constant length while allowing it to swing freely; the swing controller rotates the leg to match the desired touchdown angle. The touchdown angle is proportional to the difference between the desired and current velocity.

    Energy injection is accomplished by temporarily increasing the leg spring constant during the second half of the stance phase, thereby producing extra force in the direction of motion. 
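The pieces above fit together roughly as follows. All gains, stiffnesses, and the phase convention are illustrative, not the controller's actual values.

```python
# Hedged sketch of the SLIP controller structure described above. Gains,
# stiffnesses, and phase conventions are illustrative, not the real values.
import math

K_SPRING = 4000.0   # nominal leg spring constant [N/m] (illustrative)
K_INJECT = 1.15     # stiffness multiplier for energy injection (illustrative)
K_VEL = 0.08        # touchdown-angle feedback gain [s/m] (illustrative)

def leg_phase(t, period, leg_index):
    """Time-based part: repeating timers, 180 degrees out of phase."""
    return math.fmod(t / period + 0.5 * leg_index, 1.0)

def touchdown_angle(v_current, v_desired, theta_neutral=0.0):
    """Swing target: touchdown angle proportional to velocity error."""
    return theta_neutral + K_VEL * (v_current - v_desired)

def leg_spring_force(length, rest_length, stance_phase):
    """Stance force: virtual leg spring, stiffened in the second half of
    stance to inject energy in the direction of motion."""
    k = K_SPRING * (K_INJECT if stance_phase > 0.5 else 1.0)
    return k * (rest_length - length)
```

The state-based part simply switches a leg between `leg_spring_force` (stance) and `touchdown_angle` tracking (swing) based on ground contact.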

    Next I'll extend the controller to work in 3D (as it's currently confined to the sagittal plane) and add balance control for the torso. 

  • Leg testing

    Gabrael Levine, 09/24/2018 at 06:43

    I'm running some sinusoidal trajectories on the leg to simulate different modes of locomotion such as walking and jumping. The actual controller will be far more complicated of course, but this gives me the opportunity to check the range of motion. 

    I also put the leg right-side-up on the ground to see how much current it draws when standing. The results were promising -- only 15A per actuator. Or in other words, only 37 watts for the entire leg. It seems highly likely that I'll be able to hit the goal of 2 hours of operation on a 500Wh battery. (The real robot will need to support more weight, like the battery and yaw/roll actuators, but this will be mitigated by the new actuator I'm designing that features a 50% higher gear ratio and better cooling.)
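A quick sanity check on those numbers; the overhead term here is our assumption, not a measurement.

```python
# Back-of-the-envelope runtime estimate from the standing-power measurement.
# The overhead term (compute, IMU, yaw/roll actuators) is an assumption.

BATTERY_WH = 500.0
LEG_STANDING_W = 37.0     # measured while standing (from the log above)
OVERHEAD_W = 30.0         # electronics + extra actuators (assumed)

total_w = 2 * LEG_STANDING_W + OVERHEAD_W   # both legs plus overhead
runtime_h = BATTERY_WH / total_w
print(round(runtime_h, 1))  # about 4.8 h standing still
```

Walking and balancing draw considerably more than quiet standing, which is why the 2-hour operating goal still leaves a healthy margin over this idle estimate.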





casainho wrote 10/02/2019 at 14:43

If this project is open source, where are the software sources? I was searching a lot and could not find them. Thanks.


katayoun.hatefi wrote 08/08/2019 at 19:36

Hi, is it possible to upload the code for the jumping robot? Thanks in advance.


Ben Bokser wrote 08/05/2019 at 00:45

Hi! Great project. I'm curious, why did you replace the four bar linkages with belts?


Ben Bokser wrote 08/05/2019 at 03:05

Was it radial play in the bearings? Positioning issues because of the CF tubes?


erwin.coumans wrote 06/19/2019 at 22:37

Very exciting project! If you run into issues with PyBullet let me know. It would be nice to include your simulation as an example in PyBullet.


Pedro Morais wrote 04/13/2019 at 04:42

Are you simulating motor dynamics in your videos? Great job by the way. 


Nathan wrote 09/22/2019 at 08:34

There are some very soft motor dynamic considerations (e.g. an acceleration ramp), but we are not simulating the full motor dynamics (e.g. something like cogging torque).


Xie Zhaoming wrote 01/08/2019 at 18:26

Nice work! Can't wait to see the actual robot walking.


Darren V Levine wrote 01/02/2019 at 22:55

Really impressive work! I'm working on a very similar robot and am learning a ton from reading up on your work. Looking forward to seeing your updates!


Charles Blouin wrote 12/24/2018 at 21:14

That project looks really great! I am doing some machine learning experiments with PyBullet where I try to control a robot that uses servo motors instead of those awesome (but expensive) backdrivable motors. Would you consider sharing the PyBullet model? I would love to run an AI algorithm on your robot! There has been some very successful control of complex legged robots with joint torque control using a neural network, such as this: . The advantage of those neural controllers is that they can be trained to be very resilient to external disturbances.


Gabrael Levine wrote 12/25/2018 at 08:09

I uploaded the URDF to the files section. 

It would be awesome if you can get AI walking control to work. I tried reinforcement learning (OpenAI PPO) and neuroevolution (ESTool) and I couldn't get either to work reliably. It learned how to take a few steps, but it was very unstable. 

I plan to revisit AI control in the future, but combine it with the SLIP controller to get the best of both worlds. I think the reduced dimensionality of the SLIP model would help it learn much better/faster. 


Songyan wrote 12/25/2018 at 11:56

People used to use a look-up table (or multivariate polynomial) to record the SLIP model behavior; the other way is to use a neural network. I tried something similar, and it actually works; with a lower-dimensional model it works pretty well. I am trying to implement it in PyBullet. Maybe then I can move directly to your Blackbird, since you already uploaded your URDF model.


Charles Blouin wrote 12/31/2018 at 17:47

Thank you! I will report back if I have any success. I am working on a simpler robot controlled with RL first, a two-wheel balancer (Balboa 32U4). There are a lot of papers using RL for robot control in simulation, but few use actual robots. You might be interested in the code I wrote for motor simulation here: It is based on an ideal motor model with viscous and load friction. I wrote in the comments how the friction and motor constants can be obtained with two simple tests. I also implemented a delayed response (according to some papers, the NN does not transfer from simulation to reality if the delay is not taken into account).

It may be easier to use a neural network for control later if you can use a Raspberry Pi to control your robot. Tensorflow now has packages for Raspbian. I wrote a small script to export the NN weights trained with OpenAI PPO. They can then be reimported as a text file on the RPi, and the NN can run with TF in a relatively straightforward manner.

Good luck with your project!



Paul Crouch wrote 09/08/2018 at 20:59

Loving this, reminiscent of Cassie (minus the series-elastic components). Watching with interest.

