1. 20/09/2021 - A beginning

We are on the cusp of a very exciting time in robotics.

A step change in the capabilities of complex robots is being enabled by the convergence of the many diverse technologies required to produce embodied, intelligent behaviour in unstructured, dynamic environments.

Some, like 3D printing, are relatively mature; others, like AI and ML techniques applied to physics-based simulations, are only just emerging from labs and PhD theses.

Still others are becoming smart products, with advanced vision processing now easily accessible on consumer-level devices like the OAK-D-Lite AI camera.

In this project, I present the results of over two decades of research in humanoid robotics and offer the possibility that an engaged open-source community can deliver a working humanoid robot faster than Tesla :-)

This robot differs from the Teslabot in three main ways:

1. It is real - every part has already been physically prototyped, so whilst it is also a never-ending work in progress, working plans are available immediately for those who wish to build their own.
2. It is open source - robots that can truly enter our daily lives shouldn't be entrusted to corporations alone; see every robot-apocalypse movie ever.
3. It has wheels at the end of its legs - because this is just sensible design for human centric environments.

Like Tesla, we work closely with Nvidia on the development of the AI and other control software, but how it is applied will be up to the builders.

Non-industrial robotics has always been hampered by the explosion in control difficulty that arises from adding just a few more degrees of freedom (DOF) or operating in an unstructured environment (industrial robots avoid this by keeping their designs, environments and tasks relatively simple, with largely fixed-base designs).
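To make that explosion concrete: if each joint is discretized into k positions, a chain of n joints has k^n distinct configurations for a naive planner to consider. A back-of-envelope sketch (the joint counts and discretization are illustrative, not taken from this project):

```python
# Illustrative only: discretize each joint into k positions and count
# the configurations a naive search-based planner would have to consider.
def configuration_count(dof: int, positions_per_joint: int) -> int:
    """Size of the discretized configuration space: k ** n."""
    return positions_per_joint ** dof

# A 6-DOF fixed-base industrial arm vs. a ~40-DOF humanoid,
# with 10 discrete positions per joint.
print(configuration_count(6, 10))   # 1000000
print(configuration_count(40, 10))  # 10 ** 40
```

Every extra joint multiplies the space by k, which is why learned and simulation-trained controllers become attractive well before the humanoid scale.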

Thus a crucial part of this project is the development of a toolchain that spans design, manufacture and simulation, producing a control system that can be put back into the real world on a real robot.

All the software used is free to use, primarily Onshape for mechanical design, Cura or similar for 3D printing, and Nvidia Omniverse for simulation and control.

Of course, real hardware still costs real money - computers, gearmotors, sensors, batteries, 3D printers and material, etc. - so the robot is made of independent modular parts.

Whilst the humanoid form is arguably the best adapted to working within human-centric environments, a full humanoid is excessive for the majority of tasks that non-industrial robots will be able to perform in the near future.

There are still advantages, though, to having the robot tackle tasks in the same way as a human: for example, it facilitates the generation of essential training data and gives an intuitive understanding of how safely and efficiently the robot is performing.

With a modular design it is relatively straightforward to create a robot that performs a task like a human only where needed, greatly increasing cost-effectiveness.

Even so, the industrial servo drives used in the arms and base of this robot are not cheap, and it is anticipated that sufficient interest to justify a crowd-sourced volume order of 1000 motors - enough for 250 arms - will be a necessary step towards producing a meaningful number of robots in the real world.

The advantage of a fully simulated robot is that it is possible to experiment with and train a variety of robot solutions before incurring the cost of building a real system. Making this available in the Nvidia Omniverse and Isaac robot simulators is one of our next major milestones.

Progress to date:

The most finished part so far is the hand, which has been developed as a "hello world" project called the robotnanohand:

This version of the hand uses an Nvidia Jetson Nano computer to perform AI visual recognition "at the edge". It is built with standard RC servos that cost only $5 each, making it an economical way to get started.
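Standard RC servos of this kind are typically commanded by a 50 Hz PWM signal whose pulse width (roughly 1000-2000 microseconds) sets the horn angle. A minimal sketch of that mapping in Python - the hardware-specific PWM output (Jetson GPIO, a PCA9685 driver board, etc.) is deliberately left out, and the endpoint values are the common convention rather than a measurement of any particular servo:

```python
# Sketch only: map a joint angle to the pulse width a standard RC servo
# expects. Assumes the common 1000-2000 us pulse over a 0-180 degree sweep;
# real servos vary, so endpoints should be calibrated per servo.
def angle_to_pulse_us(angle_deg: float,
                      min_us: float = 1000.0,
                      max_us: float = 2000.0) -> float:
    """Map an angle in [0, 180] degrees to a servo pulse width in microseconds."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to the servo's range
    return min_us + (max_us - min_us) * angle_deg / 180.0

print(angle_to_pulse_us(0))    # 1000.0
print(angle_to_pulse_us(90))   # 1500.0
print(angle_to_pulse_us(180))  # 2000.0
```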

The software for the hand uses existing models to recognise hand gestures and then moves the fingers of the robot hand to copy them. It is supported by Silicon Highway, Codethink and Nvidia, with the aim of providing an introduction to the world of AI-driven humanoid robotics.
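The glue between recognition and motion can be sketched as a simple mapping stage. This is a hypothetical illustration, not the project's actual code: it assumes some hand-tracking model has already produced a per-finger flexion value in [0, 1] (0 = fully open, 1 = fully curled), and linearly maps that onto servo angles:

```python
# Hypothetical mapping stage between a gesture-recognition model and the
# hand's servos. `flexion` is assumed to come from a hand-tracking model
# as per-finger values in [0, 1]; angle endpoints are illustrative.
def flexion_to_servo_angles(flexion: dict,
                            open_deg: float = 10.0,
                            closed_deg: float = 170.0) -> dict:
    """Linearly map per-finger flexion (0=open, 1=curled) to servo angles."""
    span = closed_deg - open_deg
    return {finger: open_deg + span * max(0.0, min(1.0, f))
            for finger, f in flexion.items()}

angles = flexion_to_servo_angles({"index": 0.0, "thumb": 0.5, "pinky": 1.0})
print(angles)  # {'index': 10.0, 'thumb': 90.0, 'pinky': 170.0}
```

In a real loop the resulting angles would then be converted to servo pulses and streamed to the hand at the camera's frame rate.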

The mechanical work for the arm is also complete, and the arm can be driven manually via a waldo controller or automatically by code:

It is now undergoing endurance testing, which you can watch being live-streamed, should you so wish, at @therobotstudio on YouTube.

We are now actively working with the Nvidia visual simulation group in Zurich to complete the simulation toolchain, with the target of fully controlling the arm both as a stand-alone unit and when mounted on a moving base.

The moving base in question has also been successfully prototyped and can be seen balancing

and standing up:
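A wheeled base of this kind is typically kept upright by a feedback loop that drives the wheels to cancel the measured tilt. A minimal PID sketch of that idea - the gains, the IMU reading and the motor command are all placeholders, not the base's actual controller:

```python
# Minimal PID balance-loop sketch for a two-wheeled base.
# The IMU read and motor output are placeholders; gains are illustrative.
class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error: float, dt: float) -> float:
        """One control tick: proportional + integral + derivative terms."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=20.0, ki=1.0, kd=0.5)
tilt_rad = 0.05                       # placeholder for an IMU tilt reading
command = pid.step(0.0 - tilt_rad, dt=0.01)
print(command)  # negative: drive the wheels to counter a forward lean
```

Standing up from a lying position then amounts to a scripted sequence that swings the base through the balance point before handing control to this loop.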

Within the next few weeks a pair of arms will be attached to this base and the first complete humanoid of this type assembled - stay tuned for photos and videos!

The process of uploading all the CAD, STLs, microprocessor code, etc. is ongoing - please show your encouragement by liking this project!

In the files section are all the STL files required to print the hands and arm. Detailed instructions on how to assemble the hands are given on the robotnanohand page. If you really can't wait to produce arms as well, get in touch directly to be part of the beta program.

The robot revolution is almost upon us - if you want to be part of making it happen, it's time to get involved!