Unfortunately, I did not know about this year's Hackaday Robotics Module Challenge in time, so I missed the submission deadline. However, I believe my current robot build has a number of current and planned features that make it eligible for the Human Computer Interface Challenge. Please read the Details section and Logs for more information!
This quadruped robot was born out of a learning exercise in Autodesk Fusion 360. The robot hardware is mostly complete, with legs and chassis designed, built and assembled. A Python test program currently solves the inverse kinematics of the leg and base. Some simple walking gaits have been implemented. Future work will focus on finishing the hardware, building more complicated walking routines and mounting a 3D sensor for environmental perception and SLAM.
In addition to trying to provide this robot build with autonomous behaviour (e.g. SLAM navigation), another objective is to explore various innovative ways of controlling its moving and walking behaviour, and its interaction with the user. The walking and steering motion of the robot can already be controlled via the GUI, keyboard or Xbox One game controller.
So I am currently working on improving the existing user-controlled functions, as well as implementing some interesting new ones:
Fine-tune input for controlling the walking and steering motion of the robot.
Exploring interesting ways that a robot 'tail' can interact with the user.
The CAD model went through a couple of iterations before I decided on the final form: from eight polygonal sections to six smaller, smoother sections.
For 3D printing, the base section went through a redesign, since the original idea of just gluing the small section to the base was clearly not going to provide enough stability. I also removed yet another link, to avoid the tail ending up too big and heavy.
The base section of the tail now screws onto two of the existing holes on the rear base, meaning there is no need to modify it, other than replacing two M3 bolts with longer ones.
The tail links are all held together with a 150 mm long, 5 mm diameter spring, scavenged from a flexible long reach pick-up tool.
To make the models easy to 3D print, I sliced them all down the centre, then glued the halves together. Loctite super glue works well with PLA (the gel type works best). There were a few failed prints in the process, which I put down to rushing, and to using PLA which had been out in the air gathering moisture for several months! Prints were made on a Flashforge Creator Pro, with 15% infill.
Here are some pictures of the printing and assembly progress, leading to the final result:
After 3D printing a few more plastic parts and cutting all the aluminium plates, the custom chassis was finally complete! Below are some notes on the remaining 3D parts and the metal plates.
More 3D printed parts and painting
I printed off some of the remaining parts of the chassis: The battery compartment was best printed upright with minimal support structure needed. The rear bumper was trickier than the front bumper, because of the additional hole space for the battery, so I found it was best to print upright, with a raft and curved supports at the bottom.
Once all the parts were printed and given some more sanding, I spray-painted them with plastic primer, then blue paint, and finally clear sealer.
I was initially thinking of finding an online service to cut out the aluminium chassis parts, but then decided it would be faster and cheaper to just get some 1.5 mm thick aluminium sheets from eBay and cut them on a jigsaw table.
I used Fusion 360’s drawing tool to export to PDF the parts I needed to cut out: four chassis plates and four foot plates. I then printed them at actual scale and glued them onto the aluminium to use as cutting templates.
I threaded the holes on all the 3D parts, which were either 3 mm wide where the aluminium plates attach, or 2 mm at the leg and spine bracket attachment points. Using a tap for the 3 mm holes worked pretty well, but the 2 mm holes were more prone to being stripped or too loose, so manually threading the holes with the bolts worked better. Another issue was the infill surrounding the internal thread cylinder sometimes being a bit too thin. In retrospect, I’d try designing the 3D parts to use heat-set or expandable inserts, especially for the smaller threads.
The servo brackets attaching to the chassis have a large number of holes (16 for each leg and one of the spine brackets, and 12 for the other spine bracket), so the screws seem to secure the brackets well enough so far. The spine section bears a lot of stress from the weight of the whole chassis and legs, so it may not be able to withstand much twisting force, and the servos may not be strong enough in this area, but I will have to test this in practice with new walking gaits.
The custom chassis has finally made it from a 3D design to reality, with relative success so far. Some of the threaded holes on the 3D parts are not as strong as I’d like, the AX-12 may be under-powered for the spine connection, and the brackets anchoring the spine may be the first to give way under twisting forces. The chassis as a whole would also benefit from a weight-saving exercise and perhaps being thinned down. But this is only the first iteration of the main chassis, and the robot design has now become a reality and seems to stand up well.
… or more correctly, from CAD to reality, as it is time for 3D printing!
Initially, before getting my own 3D printer, I used Shapeways to print copies of some of the Robotis plastic brackets (they were hard to source online).
At one point I also calculated the cost of the custom 3D printed parts by getting quotes from Shapeways: the basic parts required totalled £1,000! Luckily I got a FlashForge Creator Pro 2017 soon after, so I was able to print the structural parts at home for much less.
After designing the initial version of each part in Fusion 360, I then made a second pass with some practical updates such as:
adding fillets around the edges
decreasing nut hole diameters by 0.2 mm in order to provide some material for self-tapping threads
increasing the tolerance of some slots by the same amount, to allow for their connection to interlocking plastic tabs
modifying the rear section to provide more rigidity to the central connection with the spine servo bracket, by adding a 90° tab to the rear underside aluminium base
Here are some images of the CAD models of the chassis parts:
Front body assembly
Rear body assembly
All parts were printed in PLA plastic.
The first part I started with was the foot base. I printed it with a 20% honeycomb infill. I didn’t add any intermediate solid layers, but might do so in other parts.
Each leg connects to a leg base bracket, which is the same design for all legs. The part was printed “upside-down” because of the orientation of the interlocking tabs, which meant some support structure was needed for the holes. For the first print attempt I also added supports around the overhang of the filleted edge, along with a brim, but for subsequent prints I didn’t bother with these, as the fillet overhang held up fine without supports, and skipping them saved some extra filing and sanding. These parts also used 20% infill.
For the front and rear “bumpers”, I reduced the infill to 10%.
For the larger part comprising the central section of the front, the spine front bracket, I also used an infill of 10%. Due to its more complicated design, which would have included many overhangs, I found it easier to cut the part lengthwise and print it as two separate pieces, to be super-glued together after sanding.
Time-lapse GIFs and images of the printing process:
Spine Front Bracket
Parts & Assembly
In terms of printing times, the foot bases and leg base brackets took about 3 hours each, the bumpers around 4 hours each, and the two spine front bracket halves about 7 hours combined, so the total printing time was fairly long!
The 0.2 mm clearance seems to work fine for self-threading the plastic with M2 metal bolts, but was too large for some of the plastic-to-plastic interlocking tabs, possibly because this tolerance is close to the resolution limits of the printer (theoretically a 0.4 mm nozzle and 0.18 mm layer height). However, after some filing and sanding, all the plastic parts fit together nicely.
The resulting 3D prints before and after sanding:
Finally, here are some images of how the chassis assembly shaped up, as well as the foot bases shown attached to the foot metal brackets. These fitted snugly without any sanding, and all the holes aligned perfectly with the metal brackets, which was reassuring!
The next step was to glue the front bracket halves together, print the remaining parts, and paint everything.
Fine-tune input for controlling the walking and steering motion of the robot
I have added a new input mode which allows the predefined walking gaits to be scrolled through via the keyboard or controller inputs. In effect, this means the controller can be used to “remote-control” the walking of the robot. The walking gaits still need a lot of tuning, but the basic function is now implemented.
Exploring interesting ways that a robot 'tail' can interact with the user
I am currently working on the design of a tail for the robot, as a way of adding a more animal-like quality to the design, as well as a fun way for the robot to convey 'emotion' and other types of feedback. I'm not sure of the exact method of actuation; it might not have any! The design will be composed of multiple parts of similar shape but decreasing size, which results in a tapered, snake-like appendage:
The parts will have a hollow central section, which will allow a flexible material to be threaded through for support. So far the best material I have found is the long spring section from a flexible long reach pick-up tool, which can be found cheap on eBay:
Adding a 3D sensor head to the robot
I originally had two ideas for area scanners which could be the main “eyes” of the robot. One is the Kinect v2, and the other a Scanse Sweep.
The main advantages of the Sweep are that it is designed specifically for robotics, with a large range and operation at varying light levels. On its own it only scans in a plane by spinning 360°, but it can be turned into a spherical scanner with little effort.
The Kinect has good resolution and is focused on tracking human subjects, being able to track 6 complete skeletons and 25 joints per person. However, it only works optimally indoors and at short ranges directly in front of it. It is significantly cheaper than the Sweep, but much bulkier.
However, more recently I have been looking at a third option from Intel's range of depth cameras. The Intel RealSense ZR300 seemed like an ideal choice, but since it has been discontinued, its successors in the D400 series now seem like the best choice.
I rendered the various options on the robot for a size comparison:
Updating the user's graphical interface
In previous work with a robot that uses the same servos, I created a Qt-based GUI written in C++, integrated with the USB motor controller as well as the ROS ecosystem.
My idea is to re-use many of the components in this interface, in order to improve the interaction with the quadruped. This should be fairly simple, as the GUI was mostly indifferent to the physical configuration of the motors.
Here are some screenshots showing some of its features:
I added some exponential smoothing to the original walking gaits, to smooth the edges of the trajectories and create a more natural movement.
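A minimal sketch of this kind of exponential smoothing, assuming a simple first-order filter; the smoothing factor and sample trajectory below are illustrative, not the robot's actual values:

```python
# Hedged sketch: exponential smoothing of a gait trajectory.
# 'alpha' and the sample data are illustrative stand-ins.

def smooth(trajectory, alpha=0.3):
    """Exponentially smooth a list of target values.

    s[0] = x[0];  s[t] = alpha * x[t] + (1 - alpha) * s[t-1]
    """
    smoothed = []
    prev = trajectory[0]
    for x in trajectory:
        prev = alpha * x + (1 - alpha) * prev
        smoothed.append(prev)
    return smoothed

# A square-ish step profile: smoothing rounds off the sharp edges.
raw = [0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0]
print(smooth(raw))
```

A smaller `alpha` gives heavier smoothing, at the cost of the output lagging further behind the raw trajectory.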
I then added a ±30° pitching motion to what best approximates the ‘ankle’ (4th joint), to emulate the heel off and heel strike phases of the gait cycle.
The range of motion of the right ankle joint. Source: Clinical Gate.
I realised however that applying the pitch to the foot target is not exactly the same as applying it to the ankle joint. This is because, in the robot’s case, the ‘foot target’ is located at the centre of the lowest part of the foot, which comes into contact with the ground, whereas the ankle is higher up, at a distance of two link lengths (in terms of the kinematics, that’s a4+a5). The walking gait thus does not end up producing the expected result (this is best explained by the animations at the end).
To account for this, I simply had to adjust the forward/backward and up/down position of the target, in response to the required pitch.
With some simple trigonometry, the fwd/back motion is adjusted by -A*sin(pitch), while the up/down motion is adjusted by +A*(1-cos(pitch)), where A is the distance a4+a5 noted previously.
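That adjustment can be sketched in a few lines; the distance `A` used below is an illustrative stand-in for a4+a5, not the robot's real link lengths:

```python
import math

# Hedged sketch of the target adjustment described above.
def adjust_target_for_pitch(fwd, up, pitch_deg, A):
    """Offset the foot target so the pitch effectively acts at the ankle."""
    pitch = math.radians(pitch_deg)
    fwd_adj = fwd - A * math.sin(pitch)          # fwd/back: -A*sin(pitch)
    up_adj = up + A * (1.0 - math.cos(pitch))    # up/down:  +A*(1-cos(pitch))
    return fwd_adj, up_adj

# Zero pitch leaves the target unchanged; a positive pitch shifts it
# backwards and slightly upwards to compensate for the ankle offset.
print(adjust_target_for_pitch(0.0, 0.0, 30.0, A=50.0))
```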
Here are the details showing how the targets are adjusted for one particular phase in the creep walk gait (numbers involving translations are all normalised because of the way the gait code currently works):
Creep walk gait, 25-step window where the foot target pitch oscillates between ±30°. Original target translations (fwd/back, up/down) are shown compared to the ones adjusted in response to the pitch.
The final results of the foot pitching, with and without smoothing, are shown below:
Following on from the previous post on walking and steering, I realised that when moving the spine joints, the rear feet remain anchored to the ground, when it would be better if they rotated around the spine motors, to give a better turning circle for steering.
The feet remain fixed because their targets are defined in world coordinate space, so moving the spine won’t change the targets.
There are advantages to defining the targets in world space for future work, when the robot knows more about its environment. For example, the legs can be positioned in the world in order to navigate upcoming terrain or obstacles. But for now, it is often useful to work in coordinates local to the base (front base for front legs, rear base for rear legs), since this way you don’t have to worry about the relative positioning of the front base w.r.t. the rear.
I will eventually update the kinematics code so either world or local targets can be selected.
For now however, I have made an update to the code, so if the spine joint sliders, gaits or walking/steering inputs are used, the rear leg targets move with the spine. To explain this better visually:
Another minor adjustment you might notice was the widening of the stance, to provide a larger support polygon. The walking gaits still need fine-tuning, as walking on the actual robot is still unstable and slow.
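The spine-following update above amounts to rotating the rear foot targets about the spine joint. A minimal 2D sketch of that idea, with illustrative coordinates rather than the robot's real geometry:

```python
import math

# Hedged sketch: rotate a rear foot target about the spine joint in the
# horizontal plane, so the rear legs follow the spine instead of staying
# anchored in world space. Positions here are illustrative.

def rotate_about_spine(target_xy, spine_xy, angle_deg):
    """Rotate a 2D foot target about the spine joint position."""
    angle = math.radians(angle_deg)
    dx = target_xy[0] - spine_xy[0]
    dy = target_xy[1] - spine_xy[1]
    x = spine_xy[0] + dx * math.cos(angle) - dy * math.sin(angle)
    y = spine_xy[1] + dx * math.sin(angle) + dy * math.cos(angle)
    return x, y

# A 90° spine deflection swings a target one unit behind the joint
# out to the side instead of leaving it fixed in the world.
print(rotate_about_spine((0.0, -1.0), (0.0, 0.0), 90.0))
```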
First, I have updated the walking gaits with additional target values. Second, I have added a new input mode which allows the predefined walking gaits to be scrolled through via the keyboard or controller inputs. In effect, this means the controller can be used to “remote-control” the walking of the robot! The walking gaits still need a lot of tuning, but the basic function is now implemented.
I have updated the CSV spreadsheet for gait data, so that it now includes the 5 possible degrees-of-freedom of each foot (XYZ and Roll/Pitch), the 6 DoF of the base, and the 2 spine joints.
The walking gait’s updated list of foot target values (first 50 out of 100).
The foot target values visualised (base and spine joints not shown).
In Python, all the CSV data is loaded into an array. One of the keyboard/controller inputs can now also be used to update an index, that scrolls forwards/backwards through the array’s rows.
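A minimal sketch of that loading-and-scrolling idea; the column layout below is a simplified stand-in for the real gait spreadsheet:

```python
import csv
import io

# Hedged sketch: load gait frames from CSV and scroll through the rows
# with a signed input, wrapping at either end.

csv_text = """fwd,up
0.0,0.0
0.5,0.2
1.0,0.0
"""

# Skip the header row and parse the values as floats.
rows = [[float(v) for v in r] for r in list(csv.reader(io.StringIO(csv_text)))[1:]]

def scroll(index, direction, n_rows):
    """Advance the gait index forwards/backwards, wrapping around."""
    return (index + direction) % n_rows

i = 0
i = scroll(i, +1, len(rows))   # forwards one frame
i = scroll(i, -1, len(rows))   # back again
i = scroll(i, -1, len(rows))   # wraps around to the last frame
print(i, rows[i])
```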
Next, to get the robot to turn, a second input controls a deflection value which adjusts one of the spine joints and the base orientation (as was mentioned in a past post). The deflection slowly decreases back to 0, if the input is also 0.
By doing this, the walking gait can be controlled at will by the two inputs, and hopefully make the robot walk and turn. Next comes the fine-tuning and testing!
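The deflection behaviour can be sketched as a small update loop; the gain and decay rate below are illustrative values, not the tuned ones:

```python
# Hedged sketch: a steering input drives the spine deflection, and with
# zero input the deflection decays back towards 0 over successive frames.

def update_deflection(deflection, user_input, gain=2.0, decay=0.8):
    """One frame of the steering deflection update."""
    if user_input != 0.0:
        return deflection + gain * user_input
    return deflection * decay  # no input: relax back towards zero

d = 0.0
d = update_deflection(d, 1.0)        # steer: deflection grows
for _ in range(10):
    d = update_deflection(d, 0.0)    # input released: deflection decays
print(round(d, 3))
```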
I have been testing the movement of the robot’s base in the world, while keeping the legs fixed to the ground, as a test of the robot’s stability and flexibility.
The robot base can now be controlled, either via the GUI, keyboard or gamepad, in the following ways:
Translation in XYZ
Movement of the two spine joints – Front of robot remains still, while rear adjusts
Movement of the two spine joints – Front of robot attempts to counteract the motion of the rear
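The two spine-movement modes in the list above can be sketched very simply; the angles are illustrative, and the real code of course works through the full kinematics rather than a direct negation:

```python
# Hedged sketch of the two spine modes: either the front stays still while
# the rear deflects, or the front counteracts the rear's motion.

def spine_modes(spine_angle_deg, counteract=False):
    """Return (rear_angle, front_compensation) for a spine deflection."""
    if counteract:
        # Front of the robot rotates opposite to the rear to stay level.
        return spine_angle_deg, -spine_angle_deg
    return spine_angle_deg, 0.0  # front remains still

print(spine_modes(15.0, counteract=True))
```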
You may notice the real robot can’t move its upper leg all the way to horizontal, as the IK might suggest is possible, because of a small clash between the AX-12 and the metal bracket; this should be fixable by filing or bending the curved metal tabs:
I have recently written an OpenCM sketch to control the robot servos, in a way similar to how it was being done with the older Arbotix-M, but this time using the Robotis libraries for communicating with the motors.
I have also been making various updates to the Python test code, with a few of the main issues being:
Improved the code for positioning the base and base target in world
Updated base/spine transforms – Front legs now move with base, not first spine joint
Fixed the leg IK – Legs now remain in line with ground when the base moves
Added new keyboard/joystick input modes for controlling base position, base orientation, spine joints
Updated the serial string sending function and fixed some minor issues
Moved a load of script configuration variables to a separate Params module
Added a combo box to the GUI front-end for loading a selection of CSV files (as an update to the previous two fixed buttons)
With the hardware for all four legs gathered, I have assembled the first standalone version of the quadruped. The MakerBeam XL aluminium profiles were adopted as before, to create a temporary chassis.
The fact that the robot can now stand on its four feet meant I could quickly test the walking gaits on the real hardware: the Python test software reads the up/down and forward/back position of each leg for each frame of the walking gait, solves the IK, and streams the resulting joint values over serial to the Arbotix-M, which simply updates the servo goal positions. No balancing or tweaking has been done whatsoever yet, which is why in the video I have to hold on to the back of the robot to prevent it from tipping backwards.
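The streaming pipeline can be sketched roughly as below. The IK stub, message format, baud rate and port name are all assumptions for illustration, not the project's actual protocol; the transmission step (via pyserial) is left commented out so the sketch runs without hardware:

```python
# Hedged sketch: for each gait frame, solve the IK (stubbed here) and pack
# the joint goal positions into a serial message for the Arbotix-M.

def solve_ik(frame):
    """Stand-in for the real leg IK: maps foot targets to joint values."""
    return [int(512 + 100 * v) for v in frame]  # placeholder mapping

def frame_to_message(joint_values):
    """Pack joint goal positions into a comma-separated serial string."""
    return ",".join(str(v) for v in joint_values) + "\n"

gait = [[0.0, 0.1], [0.2, 0.3]]  # illustrative up/down, fwd/back frames

messages = [frame_to_message(solve_ik(f)) for f in gait]
print(messages[0])

# import serial                                        # pyserial
# with serial.Serial("/dev/ttyUSB0", 115200) as port:  # port/baud assumed
#     for msg in messages:
#         port.write(msg.encode())
```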
A chance for some new photos:
I took some time to make a video of the progress so far on this robot project:
Finally, here is an older video showing the Xbox One controller controlling the IK’s foot target position, and a simple serial program on the ArbotiX-M which receives the resulting position commands for each motor (try to overlook the bad video quality):
In the next stage I will start building the robot's body as per the CAD design, which consists for the most part of 3D printed parts and aluminium sheets, combined with the 2 DoF "spine" joints.