Blackbird Bipedal Robot

A lightweight and efficient human-scale bipedal robot.

The aim of this project is to build an underactuated biped based on the spring-mass model -- similar to Agility Robotics' Cassie and its predecessor ATRIAS.

This robot differs from other spring-mass-based bipeds in the type of actuators used -- specifically, quasi-direct-drive rather than series-elastic. This allows the "springs" to be implemented in software rather than hardware, simplifying the design and reducing cost.
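As a rough illustration of what a "spring in software" looks like, here's a minimal sketch of a virtual spring-damper law running on a torque-controlled quasi-direct-drive joint. The gains, function names, and the driver call are placeholders for illustration, not the actual Blackbird controller.

# Minimal sketch of a virtual spring-damper computed every control tick
# and commanded as torque to a QDD joint. Values are illustrative only.

def virtual_spring_torque(q, qd, q_rest, k=80.0, b=1.5):
    """Emulated joint spring-damper: tau = -k * (q - q_rest) - b * qd."""
    return -k * (q - q_rest) - b * qd

# At each 1 kHz tick, read the joint state and command the torque, e.g.:
#   tau = virtual_spring_torque(q_meas, qd_meas, q_rest=0.0)
#   motor_driver.set_torque(tau)   # hypothetical driver call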

This is the actuator I designed for use in this project: OpenTorque Actuator

Blackbird stands 1.2 meters tall and weighs roughly 15 kg. Its estimated operation time is 4 hours with a 400 Wh battery. The total BoM cost is less than $3000. 

Once the prototype Blackbird is built and walking (estimated February 2019), I'll open-source the design so anyone can build or modify it. I believe that hobbyists and researchers can hugely benefit from access to a low-cost, highly capable biped platform like the Blackbird. 

Some reference materials I'm using:

Zip Archive - 145.57 kB - 12/25/2018 at 07:43


  • Mechanical build

    Gabrael Levine · 04/30/2019 at 06:10 · 0 comments

    Over the last few weeks, I've been iterating on the design and making parts. The Blackbird is now close to being ready to walk. 

    The few remaining challenges to be solved have to do with the electronics, mainly wiring up the Raspberry Pi and synchronizing all the motors over CAN bus. I've chosen to use five MCP2515 CAN boards connected to the Raspberry Pi over SPI. Each of the five CAN channels will connect to a single ODrive board, ensuring the 1000 Hz control frequency required for closed-loop torque control. 
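    As a sketch of how that might look in software (assuming the MCP2515 boards show up as SocketCAN interfaces can0 through can4, and using python-can), the loop below sends one torque frame per ODrive per tick. The arbitration ID and payload layout are placeholders, not the real ODrive CAN protocol.

    # Sketch: one torque command per ODrive per 1 kHz tick, over five
    # SocketCAN channels (one MCP2515 board each). Arbitration ID and
    # payload layout are placeholders, not the real ODrive CAN protocol.
    import struct
    import can

    buses = [can.interface.Bus(channel=f"can{i}", bustype="socketcan")
             for i in range(5)]

    def send_torque(bus, node_id, torque_nm):
        data = struct.pack("<f4x", torque_nm)   # 4-byte float + 4 pad bytes
        bus.send(can.Message(arbitration_id=node_id, data=data,
                             is_extended_id=False))

    # Inside the 1 kHz control loop:
    # for bus in buses:
    #     send_torque(bus, node_id=0x01, torque_nm=0.0)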

    Special thanks to ODrive Robotics for sponsoring this project. 

    More updates coming once it walks (hopefully within the next couple weeks!)

  • More controller improvements

    Gabrael Levine · 01/04/2019 at 07:06 · 0 comments

    The Blackbird has a couple of new tricks. It can now turn around and walk over stairs.

    I still have more features to add (for example, the ability to stand still), but once that's done I'll publish the source code for the controller. 

  • Balance Control

    Gabrael Levine · 12/17/2018 at 23:55 · 0 comments

    I've implemented PD balancing control for the torso. Previously there were prismatic constraints keeping the torso parallel to the ground, but now the robot is entirely self-balancing. 

    To prevent the feet from slipping, the controller limits the balancing torque according to the current axial force on the leg. When the axial leg force is small (such as immediately after touchdown or right before liftoff), the controller knows not to apply too much balancing torque to that leg. 

    Conversely, during the double support phase (both legs on the ground), the controller evenly distributes the balancing torque between both legs. This keeps it from simultaneously applying full balancing torque to both legs, which would cause the robot to "overbalance" and fall over. 
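    A toy sketch of that force-aware torque allocation is below (the gains and friction margin are made up for illustration; this isn't the exact controller code):

    def balance_torques(pitch_err, pitch_rate, f_left, f_right,
                        kp=120.0, kd=8.0, margin=0.02):
        # PD torque requested to level the torso.
        tau_des = -kp * pitch_err - kd * pitch_rate

        # Per-leg limit grows with axial force, so a lightly loaded leg
        # (just after touchdown or just before liftoff) gets little torque.
        lim_l = margin * max(f_left, 0.0)
        lim_r = margin * max(f_right, 0.0)

        # In double support, split the request in proportion to leg loading
        # instead of applying the full torque to both legs at once.
        total = max(f_left, 0.0) + max(f_right, 0.0) + 1e-6
        tau_l = min(max(tau_des * max(f_left, 0.0) / total, -lim_l), lim_l)
        tau_r = min(max(tau_des * max(f_right, 0.0) / total, -lim_r), lim_r)
        return tau_l, tau_r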

  • Walking control part 3

    Gabrael Levine · 12/07/2018 at 05:07 · 0 comments

    I've added the full-order Blackbird model to the PyBullet simulation. 

    It's running the same SLIP controller as before, but the Blackbird's torso and feet are constrained to the positions of the SLIP walker's torso and feet. 

    This means the assumptions of the SLIP model no longer hold (the legs are no longer massless), yet the controller performs admirably with no modifications. I take this as evidence of the inherent stability of the robot/controller design.
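    For reference, one way to set up those constraints in PyBullet is with point-to-point constraints between the two models; the body IDs and link indices below are placeholders, and this is just a sketch of the approach rather than the exact simulation code.

    import pybullet as p

    def pin(parent_body, parent_link, child_body, child_link):
        # Point-to-point constraint gluing one link's origin to another's.
        return p.createConstraint(parent_body, parent_link,
                                  child_body, child_link,
                                  jointType=p.JOINT_POINT2POINT,
                                  jointAxis=[0, 0, 0],
                                  parentFramePosition=[0, 0, 0],
                                  childFramePosition=[0, 0, 0])

    # e.g. pin(slip_walker_id, TORSO_LINK, blackbird_id, TORSO_LINK)
    #      pin(slip_walker_id, LEFT_FOOT_LINK, blackbird_id, LEFT_FOOT_LINK)
    #      pin(slip_walker_id, RIGHT_FOOT_LINK, blackbird_id, RIGHT_FOOT_LINK)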

  • Walking control part 2

    Gabrael Levine · 12/06/2018 at 01:08 · 0 comments

    Here you can see the improved controller in action. It's able to walk in 3D (in this case following a circular path) and reject large disturbances. Still to be added: torso balancing, transitions between walking/standing, and yaw control. 

  • Walking Control

    Gabrael Levine · 11/25/2018 at 04:31 · 1 comment

    Prior to building the robot, I'm developing the controls on a reduced-order SLIP model. The robot's legs are modeled as massless, springy linear actuators on revolute joints. All the mass is located at the hip. 

    The controller here was inspired by this paper.

    This controller has two basic parts:

    State-based control: Switches each leg between the stance and swing controllers according to whether the foot is in contact with the ground. 

    Time-based control: Lifts the legs and performs energy injection according to a repeating timer. The timers for the two legs are 180 degrees out of phase with one another.

    The stance controller holds the leg at a constant length while allowing it to swing freely, while the swing controller rotates the leg to match the desired touchdown angle. The touchdown angle is proportional to the difference between the desired and current velocity. 

    Energy injection is accomplished by temporarily increasing the leg spring constant during the second half of the stance phase, thereby producing extra force in the direction of motion. 
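    Putting those pieces together, here's a toy sketch of the per-leg logic. The gains, rest length, and phase period are illustrative placeholders, not the real values from my controller.

    import math

    class SlipLegController:
        def __init__(self, l0=0.9, k_leg=5000.0, k_td=0.18, period=0.8):
            self.l0, self.k_leg, self.k_td, self.period = l0, k_leg, k_td, period

        def phase(self, t, leg_index):
            # Time-based part: repeating timer, legs 180 degrees out of phase.
            return math.fmod(t + leg_index * self.period / 2.0, self.period)

        def touchdown_angle(self, v, v_des):
            # Swing target: touchdown angle proportional to the velocity error.
            return self.k_td * (v - v_des)

        def stance_force(self, leg_len, second_half_of_stance):
            # Virtual leg spring; stiffen it during the second half of stance
            # to inject energy in the direction of motion.
            k = self.k_leg * (1.2 if second_half_of_stance else 1.0)
            return k * (self.l0 - leg_len)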

    Next I'll extend the controller to work in 3D (as it's currently confined to the sagittal plane) and add balance control for the torso. 

  • Leg testing

    Gabrael Levine · 09/24/2018 at 06:43 · 0 comments

    I'm running some sinusoidal trajectories on the leg to simulate different modes of locomotion such as walking and jumping. The actual controller will be far more complicated of course, but this gives me the opportunity to check the range of motion. 

    I also put the leg right-side-up on the ground to see how much current it draws when standing. The results were promising -- only 15 A per actuator, or about 37 watts for the entire leg. It seems highly likely that I'll be able to hit the goal of 2 hours of operation on a 500 Wh battery. (The real robot will need to support more weight, like the battery and yaw/roll actuators, but this will be mitigated by the new actuator I'm designing, which features a 50% higher gear ratio and better cooling.)
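    For context, a back-of-the-envelope runtime estimate from that measurement (the overhead figure is a guess, and walking will draw considerably more than standing):

    standing_power_per_leg_w = 37.0   # measured on the single-leg rig above
    overhead_w = 30.0                 # guess: yaw/roll actuators, electronics
    battery_wh = 500.0

    est_power_w = 2 * standing_power_per_leg_w + overhead_w    # ~104 W standing
    est_runtime_h = battery_wh / est_power_w                   # ~4.8 h standing
    print(f"~{est_runtime_h:.1f} h of standing on a {battery_wh:.0f} Wh pack")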

  • Leg Operational

    Gabrael Levine · 09/21/2018 at 01:56 · 0 comments

    Quick update: I finished wiring up the encoders and got the leg working. More to come soon.

  • First Leg Complete

    Gabrael Levine · 09/07/2018 at 00:26 · 0 comments

    I finished assembling the first leg for the prototype Blackbird robot. It's made from 2 OpenTorque actuators, some carbon fiber tubes, and a few printed parts. (The parts were printed out of PLA but will be redone in NylonX for the final version.) 

    Since this is a simplified prototype, there's no ankle joint. Instead it has a point-contact foot. This makes it a better approximation of the spring-loaded inverted pendulum (SLIP) model, which makes developing the controls easier, but it'll need an ankle joint to be able to stand still. Until the ankle joint is added, I can use dynamic standing (see the video below).

    Next up: building a test rig from V-slot extrusions and seeing how high the leg can jump. 


Discussions

erwin.coumans wrote 5 days ago

Very exciting project! If you run into issues with PyBullet let me know. It would be nice to include your simulation as an example in PyBullet.


Pedro Morais wrote 04/13/2019 at 04:42

Are you simulating motor dynamics in your videos? Great job by the way. 


Xie Zhaoming wrote 01/08/2019 at 18:26

Nice work! Can't wait to see the actual robot walking.


Darren V Levine wrote 01/02/2019 at 22:55

Really impressive work! I'm working on a very similar robot https://hackaday.io/project/163093-tiptap and am learning a ton from reading up on your work. Looking forward to seeing your updates!


Charles Blouin wrote 12/24/2018 at 21:14

That project looks really great! I am doing some machine learning experiments with PyBullet where I try to control a robot that uses servo motors instead of those awesome (but expensive) backdrivable motors. Would you consider sharing the PyBullet model? I would love to run an AI algorithm on your robot! There has been some very successful control of complex legged robots with joint torque control using a neural network, such as this: https://www.youtube.com/watch?v=__ilQzkPDNI . The advantage of those neural controllers is that they can be trained to be very resilient to external disturbances.


Gabrael Levine wrote 12/25/2018 at 08:09

I uploaded the URDF to the files section. 

It would be awesome if you could get AI walking control to work. I tried reinforcement learning (OpenAI PPO) and neuroevolution (ESTool), and I couldn't get either to work reliably. The policy learned how to take a few steps, but it was very unstable.

I plan to revisit AI control in the future, but combine it with the SLIP controller to get the best of both worlds. I think the reduced dimensionality of the SLIP model would help it learn much better/faster. 


Songyan wrote 12/25/2018 at 11:56

People used to use a look-up table (or multivariate polynomial) to record the SLIP model behavior; another way is to use a neural network. I tried something similar with a lower-dimensional model, and it actually works pretty well. I am trying to implement it in PyBullet, and maybe then move directly to your Blackbird since you already uploaded your URDF model.


Charles Blouin wrote 12/31/2018 at 17:47

Thank you! I will report back if I have any success. I am working on a simpler robot controlled with RL first, a two-wheel balancer (Balboa 32U4). There are a lot of papers using RL for robot control in simulation, but few use actual robots. You might be interested in the code I wrote for motor simulation here: https://github.com/charles-blouin/servorobots/blob/master/servorobots/components/test_motor.py. It is based on an ideal motor model with viscous and load friction. I wrote in the comments how the friction and motor constants can be obtained with two simple tests. I also implemented a delayed response (according to some papers, the NN does not transfer from simulation to reality if the delay is not taken into account).

It may be easier to use a neural network for control later if you can use a Raspberry Pi to control your robot. TensorFlow now has packages for Raspbian. I wrote a small script to export the NN weights trained with OpenAI PPO (https://github.com/charles-blouin/servorobots/blob/master/servorobots/tools/export_weights.py). They can then be reimported as a text file on the RPi, and the NN can run with TF in a relatively straightforward manner.

Good luck with your project!


merck.ding wrote 09/11/2018 at 11:37

This is a favorite idea of mine, and I want to build one too. I'm printing the OpenTorque actuators and the motor driver boards.


Paul Crouch wrote 09/08/2018 at 20:59

Loving this, reminiscent of Cassie (minus the series-elastic components). Watching with interest.

