Reachy: Open Source Humanoid Robot

A capable robotics platform for exploring computer vision and machine learning, featuring a 6-DoF grasping arm.

Reachy is an Open Source Humanoid Robot by Pollen Robotics. Mechlabs at Circuit Launch was the first group in the world, outside of the original designers, to successfully build the robot. All of our work is documented publicly under the same open source license, CC-BY-SA 4.0.

The Robotics CoLab at Circuit Launch undertook building Reachy during its 12-week Fall session.

This open source robotics platform is available from Pollen Robotics, and the CoLab built it from scratch. There were plenty of challenges: Reachy is very new to the world, and there is not yet much documentation or an established community contributing to the project.

Our cohort of students, many of whom had never worked on a robotics project before, succeeded in building Reachy and getting it to wave hello.


  •    Designed for SLS 3D printing, Reachy's parts needed substantial redesign to be producible in PLA on standard FDM printers.
  •    The electronics run on a microservices architecture called Luos rather than on traditional, widely adopted microcontroller setups.
  •    No one outside Pollen Robotics had fully built Reachy before we attempted this.

We've published the notes, BOM, and documentation we built along the way to contribute to the community, as we begin our next CoLab session exploring Human-Robot Interaction Design.

Reachy CoLab Project Page

  • Robot Concierge Demo

    Robotics CoLab • 04/08/2021 at 17:56

    Our Winter session of Robotics CoLab set out to explore ROS and human-robot interaction. The students developed a fully functional concierge-robot application for our Reachy build.

    Winter 2021 CoLab program goals:

    • Implement ROS (Robot Operating System) control of Reachy.
    • Create a mobile concierge robot:
      • Integrate the Ubiquity MAGNI base
      • Design and implement a pan tilt neck mechanism
      • Implement ROS as the control system
      • Perform Mask Detection
      • Implement emotive gestures
      • Create a conversational voice interface
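The pan-tilt neck goal above comes down to simple aiming geometry. As an illustration only (this is not code from the project, and the function name and frame convention are our own assumptions), a minimal sketch of converting a 3D target point into pan and tilt angles, assuming a neck frame with x forward, y left, z up:

```python
import math

def pan_tilt_to_target(x, y, z):
    """Return (pan, tilt) in degrees so a pan-tilt head faces the point
    (x, y, z), expressed in a neck frame with x forward, y left, z up."""
    pan = math.degrees(math.atan2(y, x))                   # yaw left/right
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))   # pitch up/down
    return pan, tilt

# A visitor's face 1 m ahead and 0.2 m above the neck:
pan, tilt = pan_tilt_to_target(1.0, 0.0, 0.2)
print(round(pan, 1), round(tilt, 1))  # 0.0 11.3
```

On real hardware these two angles would be streamed to the neck servos; a face detector (e.g. for the mask-detection goal) would supply the target point.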

  • Teaching Reachy to Pour H2O

    Robotics CoLab • 01/07/2021 at 04:13

    The first part of the program records the trajectory of each motor; the second part replays the motion.

    Video here
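The record-then-replay idea can be sketched in plain Python. Everything below (the `Motor` stand-in, `record`, `replay`) is a hypothetical simulation written for this page, not the Reachy SDK:

```python
class Motor:
    """Stand-in for one compliant servo: while compliant it can be moved
    by hand (present_position changes); goal_position drives it."""
    def __init__(self):
        self.present_position = 0.0
        self.goal_position = 0.0

def record(motors, n_samples, read_pose):
    """Part 1: sample every motor's present position once per tick."""
    trajectory = []
    for tick in range(n_samples):
        read_pose(motors, tick)  # stands in for the human guiding the arm
        trajectory.append([m.present_position for m in motors])
    return trajectory

def replay(motors, trajectory):
    """Part 2: stream the recorded frames back as goal positions."""
    for frame in trajectory:
        for m, angle in zip(motors, frame):
            m.goal_position = angle  # on hardware: sleep ~1/rate between frames

def demo_path(motors, t):
    """Fake hand-guided motion: each joint sweeps at its own speed."""
    for i, m in enumerate(motors):
        m.present_position = 10.0 * t * (i + 1)

# Demo: "move" two joints for 3 ticks, record, then replay.
arm = [Motor(), Motor()]
traj = record(arm, 3, demo_path)
replay(arm, traj)
print(traj[-1])                          # [20.0, 40.0]
print([m.goal_position for m in arm])    # [20.0, 40.0]
```

On the real robot, recording happens with the motors compliant (torque off) while a person moves the arm, and replay switches them stiff and streams the frames back at the capture rate.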

  • 12 weeks and it's ALIVE!

    Robotics CoLab • 01/05/2021 at 18:06

    Twelve weeks. Twelve students. Four countries. Three time zones.

    And REACHY, the open source humanoid robot, is ALIVE!

    Say hello to Reachy!

    For Circuit Launch, the Robotics CoLab Fall session was the second iteration of an experimental approach to robotics education which combined project-based learning with a collaborative methodology (you can read more about the first phase of the program in this blog post).

    The Fall session has been especially exciting, but the project was not without its challenges!

    Working collaboratively on an open source robotics project definitely has its pros and cons. There was an established path to success, forged by Pollen Robotics with Reachy. With such an innovative design and great functionality (voice recognition, computer vision, grippers, the Orbita joint, expressive antennae, etc.), there was enormous potential and great scope for exploration. However, there were some limitations: building specific parts cost-effectively, and a steep learning curve with the control system.

    We still had some blockers to overcome. 

    As well as redesigning 3D-printed parts for FDM rather than SLS, and experimenting with tough resin on the SLA printer, the students spent time and effort learning how the new Luos boards control the motors. Luos is an open source "microservices architecture" used in place of more common microcontroller platforms like Arduino.

    Troubleshooting also came to a head (pun intended) when operating the Orbita prototype (the neck joint): the weight caused Reachy's head to bang like an Iron Maiden fan's!

    For the software team, the process of learning, testing, and iterating while working remotely was aided by Pollen's Reachy simulator, which allowed for risk-free testing of code. And for live testing, a remote test bench, with assistance from students in the Circuit Launch Lab, provided more opportunity for experimentation.

    An additional challenge during a global pandemic was how to create a meaningful experience for the virtual students on such a technically complex project. Remote team members in different time zones working closely with team members in the Lab faced an uphill battle to ensure they stayed on the same page. 

    The key was communication. And giving apps like Slack, Notion, and of course Zoom a real workout. And interestingly, placing multiple cameras in the Lab while everyone worked really helped.

    With perseverance and enthusiasm come rich rewards, and in the end, the students achieved their north star goal: activation of Reachy's voice recognition capability, Reachy's verbal response and the "hello" motion:


    So what's next? 

    For the Winter Robotics CoLab program kicking off 12 January 2021, we want to build the second arm, complete the Orbita joint, and explore the possibilities of what Reachy can do, with a focus on human-robot interaction design.

    What could YOU achieve with Reachy in 12 weeks? 

    Applications close midnight on Sunday 3 January 2021. Apply now:

    Fall CoLab student Logan assembling Reachy's arm in the Lab at Circuit Launch

View all 3 project logs




Carson James Cannon wrote 02/06/2021 at 07:21 point

Hey, what's up. I'm a factory worker with no remote work options. I just bought an AGV on eBay in hopes of leasing it to my employer as a way to save enough money to buy or build a Reachy with the virtual reality telerobotics app. I believe this would have been extremely useful, and still could be, for the virus situation and the threat of automation. My hope is to collaborate to mass manufacture these things and drive the cost down; make something as affordable as a pickup truck. Once in use, collect the motion control data from remote work operation over VR to train a neural network toward eventual general artificial intelligence.


