Micro Robots for Education

We aim to make multi-robot systems a viable way to introduce students to the delight that is robotics.

We are building small robots and the accompanying software for the educational market. The aim is that the robots will set a new standard for compact and cost effective educational experiments.

A set of robots will be managed by a master system based on a Raspberry Pi, for now named 'god bot'. This will handle multiplexing the IR communications, extracting the robot locations using its camera, and communicating with the internet. It will also be the platform on which students develop applications; for most uses the robots themselves can just run 'standard' firmware.

The design methodology is to keep things cheap and simple. We want each robot to cost ~£10 per unit at quantity, and each robot should fit within a 2 cm square. This is to combat the price sensitivity of schools and the limited space classrooms have for multi-agent robotic systems.

This project spawned from teaching efforts within the Imperial College Robotics Society. We run a very popular course known as 'Robotics 101' which aims to teach students (of university age) the basics of robotics, the hope being that we can attract other disciplines (Mechanical Engineering, Biology etc.) towards being hobbyists in the robotics world. The issue we observed is that after the course, when the students had their typical A5-ish robots completed, it was difficult to then move onto bigger systems. We do have a 102 course within ICRS that tackles multi-robot systems, but when each robot is that size, the arena needs to be pretty big, much to the despair of the fire marshals (we need to block the fire exit with the arena). Also, due to the size and to some extent the cost, students all need to share the same equipment. We love sharing our stuff of course, but sometimes the thrill of discovery is killed by waiting in line.

This is where our robots come in! Small enough that a complete multi-robot system can operate within the confines of a single desk space, and cheap enough that multiple sets can be deployed per classroom. All whilst maintaining the capabilities of your typical 101 robot: front-facing collision sensors, line following sensor, IR uplink and downlink, stepper motor based odometry and an RGB indicator LED.

We want students of all ages to see past the typical line following or object avoidance tasks and think bigger. Swarm behaviour, competitive sport, collaborative task solving: having multiple robots at your disposal makes designing these systems possible.

Whilst the robots do have a few local sensors, for collision avoidance and line following/boundary detection, their main location sensing will be provided by an overhead camera and transmitted over IR to the robots. Using this in combination with the tiny stepper motor odometry, complicated trajectories are possible.
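
As a flavour of what the odometry side involves, here is a minimal dead-reckoning sketch for a differential-drive robot driven by stepper counts. The `mm_per_step` and `track_mm` values are placeholders for illustration, not the real robot's calibration:

```python
from math import sin, cos

def update_pose(x, y, theta, steps_l, steps_r,
                mm_per_step=0.05, track_mm=16.0):
    """Dead-reckoning update from left/right stepper counts.
    mm_per_step and track_mm are assumed values, not the real robot's."""
    dl, dr = steps_l * mm_per_step, steps_r * mm_per_step
    d = (dl + dr) / 2.0            # distance travelled by the centre
    dtheta = (dr - dl) / track_mm  # change in heading (radians)
    # integrate at the midpoint heading for better accuracy
    x += d * cos(theta + dtheta / 2.0)
    y += d * sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta
```

Fusing this with the overhead camera fixes then just means periodically resetting the accumulated pose to the camera's estimate.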

So please LIKE and FOLLOW this project if you are interested, we will soon have more tech demos posted and we wouldn't like you to all miss out!

(Early releases, not recommended for those who prefer code that operates, and boards that are free of bugs ;) )

PCB source. Shared under Attribution-ShareAlike 4.0 International (CC BY-SA 4.0).

Software source. Shared under GNU General Public License V3.0.

Did we mention these things are small?!

Compared to a typical 'educational robot'.

PS: If you are interested in tiny robots, I imagine that you would think large, high speed robots are cool too. Check out Oskar Weigl's project over at . Low cost, devastatingly powerful servo motors can be yours!

  • ROS Integration: Location

    Joshua Elsdon, 10/23/2017 at 10:51 (0 comments)

    Hello! This is another déjà vu post. We have already presented a system that used a camera to locate the robots. Since then I went to a large robotics conference (for an unrelated project); I took the micro robots with me and they were a minor hit amongst the people I had impromptu demos with. The lesson learned from this trip was that any demo of the project should be portable and quick to set up, so a carefully calibrated system with controlled lighting and a mount for the camera is a no-no!

    In the future I hope to integrate the following onto a single board computer of some kind, minimising the amount of trailing wires and inconvenience. Anyway, without further ado, what have I built?

    Firstly, with a free-moving camera there is no simple way to use it as a reference as in our old implementation. Therefore we need to put a reference in the environment: in our case, a small QR code. This QR code can now represent (0,0,0) for the robots. Luckily ROS has a very capable package for this task, ar_track_alvar. This package provides the transform between the camera and all QR codes that are visible in the scene. Easy as pi, eh?

    Next we need to find the robots in the camera image. The easy way would be to put QR codes onto the robots, but they are too small. It would also make each robot unique, which is awkward when you want to make 100s of them. Therefore we are going to have to do it using good ol' fashioned computer vision. For this I used the OpenCV library, which is also packaged with ROS; using some helper packages, things work smoothly.

    The actual algorithm for finding the robots is not too complicated. We threshold the incoming image to find the LED and the front connector that has been painted red with nail polish. These thresholded images are then used as input to a blob detector. The outcome is that we should have a number of candidates for LEDs and front connectors.
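
For illustration, here is a minimal pure-Python version of the threshold-and-blob step. The real pipeline uses OpenCV's thresholding and blob detector; this sketch just shows the idea with a flood-fill labeller:

```python
import numpy as np

def find_blobs(img, thresh):
    """Threshold a grayscale image and return the (x, y) centroid of
    each connected bright region (4-connectivity flood fill)."""
    mask = img > thresh
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                stack, pixels = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids
```

Running the same routine on the LED-threshold and red-threshold images gives the two candidate lists.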

    Next we need to pair them up. To do this I project all of the points from camera space to the XY plane defined by the QR code. This gives us the 'true' 3D position of these features, as I make sure that the real QR code is at the same level as the robots' top surface.
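
The projection step can be sketched with a little linear algebra. Assuming a pinhole model with intrinsics `K` and the camera pose `(R, t)` expressed in the QR-code frame (these names are illustrative, not the node's actual variables), a pixel is back-projected to a ray and intersected with the plane z = z0:

```python
import numpy as np

def pixel_to_plane(uv, K, R, t, z0=0.0):
    """Back-project pixel uv through intrinsics K, then intersect the
    ray with the plane z = z0 in the QR-code frame. R rotates camera
    coordinates into that frame; t is the camera position in it."""
    d_cam = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    d = R @ d_cam                # ray direction in the QR frame
    s = (z0 - t[2]) / d[2]       # scale factor to reach the plane
    return t + s * d
```
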

    Now, for each potential LED we have found we search all potential front connectors; if the distance between the features matches what it is in real life (13.7 mm) then we can say this is very likely a robot.
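
A sketch of that pairing test follows. The 13.7 mm separation is from the post; the tolerance is an assumption:

```python
from math import hypot

ROBOT_FEATURE_SEP = 13.7  # mm, LED-to-front-connector distance

def pair_features(leds, fronts, tol=1.5):
    """Match LED candidates with front-connector candidates whose
    separation is close to the known robot geometry (all in mm)."""
    robots = []
    for led in leds:
        for front in fronts:
            d = hypot(front[0] - led[0], front[1] - led[1])
            if abs(d - ROBOT_FEATURE_SEP) < tol:
                robots.append((led, front))
    return robots
```
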

    Next we do a check to ensure our naming of the robots is consistent with what they were in the previous iteration. This is simply associating the potential robot with the actual robot that is closest to it (based on where it was last time and the accumulated instructions sent to the robot). 
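
A minimal greedy nearest-neighbour association might look like this (in the real system the prediction is the last position plus the accumulated instructions sent to each robot):

```python
from math import hypot

def associate_ids(predicted, detections):
    """Assign each detection the ID of the nearest predicted position.
    predicted: {robot_id: (x, y)}, detections: [(x, y), ...]."""
    assignments = {}
    remaining = dict(predicted)
    for det in detections:
        rid = min(remaining,
                  key=lambda i: hypot(det[0] - remaining[i][0],
                                      det[1] - remaining[i][1]))
        assignments[rid] = det
        del remaining[rid]        # each ID is used at most once
    return assignments
```
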

    Finally we convert the positions of the LED and front connector to a coordinate system that is centred on the centre of the wheel base. This is then published to the ROS system. 
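
A sketch of that conversion: heading comes from the LED-to-connector direction, and the pose is shifted to the axle centre. The `axle_offset` value here is a made-up placeholder, not the robot's real geometry:

```python
from math import atan2, hypot

def robot_pose(led, front, axle_offset=5.0):
    """Convert LED and front-connector positions (mm) into a pose
    centred on the wheel base. axle_offset is the assumed distance
    from the LED to the axle centre along the forward axis."""
    dx, dy = front[0] - led[0], front[1] - led[1]
    heading = atan2(dy, dx)       # front connector faces forwards
    n = hypot(dx, dy)
    cx = led[0] + axle_offset * dx / n
    cy = led[1] + axle_offset * dy / n
    return cx, cy, heading
```
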

    The outcome is that ROS has a consistent transform available for each of the robots in the system; as a bonus we also have the position of the camera in the same coordinate system. This means we could make the robots face the user, or have their controls always be aligned with the user's position. (RC control is very hard when the vehicle is coming towards you; this could be a good remedy.)

    A long wordy post is not much without a video, so here you go:

    Here is a picture of the very awkward setup as it stands:

  • TOF Laser Scanner Assembled.

    Joshua Elsdon, 10/17/2017 at 10:57 (1 comment)

    An update on the progress of the robot 'hat'. This is a TOF laser sensor that can be plugged into the programming port of the robots. With a range of ~2m, it can scan a circle of ~4m with just a quick spin of the robot. It will be made on flex PCB to integrate a 90 degree bend. Probably the smallest PCB I have ever made. Bring-up of this board is not a priority, as there are core features that are still needed for the robot project; also, debugging could be a pain, as these hats use up the debugging port. When you get used to STM breakpoint debugging, flashy-LED debugging seems dreadfully inefficient.

  • ROS Integration take 2.

    Joshua Elsdon, 10/04/2017 at 11:38 (2 comments)

    Hello again.

    ROS: Did I do this before?

    ROS: Robot Operating System is a key part of modern robotics research. It is the glue that lets academics and industrial system designers package their work in a way that is easy to share and stick to other modules. If you are interested in modern robotics and don't know about ROS then get to work on the tutorials! There are alternatives to ROS, though ROS has 'won', for now.

    I am keen to have these robots fully linked up with ROS, such that I can be lazy when wanting them to do fancy things. When a robot is connected to ROS and has the relevant sensors and actuators, it can leverage pre-made cutting-edge research code with just a little XML jiggery-pokery and perhaps a few simple C++/Python nodes to do conversions. Pretty awesome.

    I made a post quite a while ago that claimed ROS integration was working on the robots, and I wasn't lying. The problem, however, was that the implementation was for version 3.0 of the robots, AKA Arduino on wheels. Now we are on version 5.0 (and 6.0 is ready to send to manufacture), which uses a different microcontroller. In redesigning the code to run on the new hardware we decided to make the communication layer far more versatile. The old version was just spoofing NEC commands over IR, and each robot had its own set of commands to allow it to move around. This worked, but there is a lot of redundancy in the NEC protocol, and the commands are sent at a very low rate. This meant that if you had more than a couple of robots you would be starved for data to each. Also, there was no provision for uplink to the master system.

    The new implementation makes use of UART as the base for the communication scheme, and this is one of the key differences between V4 and V5: the IR receiver is connected to the UART Rx pin. All you need to do is modulate your UART at 38 kHz and you are good to go. In both the robots and the master this is achieved by letting the LED straddle the Tx pin and a PWM pin; when the PWM is strutting its stuff, a high voltage on the Tx pin will light the LED only when the PWM pin is low. Now we can just use the UART hardware as usual and the LED is modulated for free.

    Where art thou packets?

    Now that we have bidirectional (half-duplex) UART we can start doing some more sophisticated communication. Because the IR channel can be very unreliable, a sensible packet format needs to be proposed. Luckily I happen to be writing a blog post about just that issue! The packet follows; each [] is one byte:

    [header][header][robots][255-robots][type][robot0 ins][robot1 ins][.....][checksum] 

    Above is the current packet implementation, which is likely to change to include better validation for many-to-one communication packets (like asking all robots for their battery voltage). 

    [header] is just a special byte to find the start of the packet, in my case 0xAC, as it has a fun alternation of bits. We send it twice to lower the chance of collision with data, though that would not be a big deal if it did happen.

    [robots] is the number of robots in the system, and [255-robots] is some redundancy to double-check this number: the sum of the two should always be 255. Getting this wrong could put a robot into a long loop waiting for the wrong number of robot commands.

    [type] is the command type, for example 'move', 'report battery voltage' or 'enter line following behaviour'. 

    [robot0 ins][robot1 ins][.....] is then the array of instruction bytes for the robots. Each robot is likely going to ignore all of these apart from its own instruction, though you could imagine that the information given to other robots could help a robot plan locally in some cases. All robots know how long the array will be, having received the [robots] value previously.

    [checksum] is a checksum, such that we can perform a sum then check the sum matches the checksum. How many sums could a checksum check...
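
The packet could be built and validated along these lines. Note that the checksum algorithm here (a simple 8-bit sum over the preceding bytes) is my assumption, as the post does not specify it:

```python
HEADER = 0xAC  # 0b10101100, the fun alternating header byte

def build_packet(ptype, instructions):
    """Assemble [header][header][robots][255-robots][type][ins...][checksum].
    Checksum is assumed to be the low byte of the sum of all prior bytes."""
    n = len(instructions)
    body = [HEADER, HEADER, n, 255 - n, ptype] + list(instructions)
    body.append(sum(body) & 0xFF)
    return bytes(body)

def parse_packet(data):
    """Validate a packet and return (type, instructions)."""
    if data[0] != HEADER or data[1] != HEADER:
        raise ValueError("bad header")
    n = data[2]
    if n + data[3] != 255:
        raise ValueError("robot count corrupted")
    if sum(data[:-1]) & 0xFF != data[-1]:
        raise ValueError("checksum mismatch")
    return data[4], list(data[5:5 + n])
```
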


  • Version 5.0 and Hats.

    Joshua Elsdon, 08/18/2017 at 20:36 (0 comments)


    Whilst it seemed a good idea at the time to make a new project on Hackaday.io to cover the developments of the robots as a more or less mature project, it turns out I disagree with my past self significantly enough to go back on my decision. So updates will be placed here, keeping continuity with all the cool stuff we got up to over the last couple of years.

    The robots are now at version 5.0. What upgrades did this bring? To be honest, nothing spectacular. We moved to having 2 upright pins for charging, removing the faff of constructing the ground connection on the bottom of the robots; that was a very manual process and would not scale. The upright pins also double as the shafts for the wheels, therefore the robots can charge on the shaft protrusions! What an enviable talent!

    I made a little box that leverages this feature such that they can all be charged whilst in storage. Even little robots need a comfortable home. The other changes were just in the pins chosen for particular peripherals, as there was a mistake where one of the light sensors was not connected to an analogue-capable pin, DOH!

    Having surveyed some of the competing robots, it seems popular to offer an expansion system. Usually this is a row of headers with useful interfaces broken out. Luckily we thought of this ahead of time and broke the I2C pins out on the 8-pin programming header. So you can plug in the additional thing you need there and you are done.

    Wait, the end of that last paragraph was not convincing?! OK, I admit there is not a huge selection of off-the-shelf sensors that will fit that particular position, pinout etc. But we are on HaD, so why not design something.

    I thought that the idea of having a laser scanner on the robot would be pretty cool for all of those micro SLAM projects that you are working on. Luckily STMicroelectronics make a very nifty, and very small, ToF distance sensor: the VL53L0X. So, sensor picked. Next we need to plug it in. On a bigger robot you could just use one of the many board-to-board connectors, or use a flex cable to a similar header on the daughter board. Unfortunately there is not space for such luxuries, so we need to find a more compact solution.

    Luckily for my Thermal Watch project I need a flex PCB, so why not solve loads of problems with it all at once. Below you can see my design for a tiny folding front ToF distance sensor 'Hat'. 

    The surrounding parts of the PCB are just where it will panelise for manufacture; the centre piece will be cut free. The part will be plugged into the programming header and bent at 90 degrees such that the sensor faces forwards. Obviously this is not perfect, as the sensor will have to be removed for reprogramming, though this is the price we need to pay to be able to fit up to 5 of these robots in our mouths at once. If you want to see a robot where you could perhaps fit 10 units in your mouth then check out my even smaller design here. << warning robot too small.

    Version 6.0 is already drawn up and ready to send away. It has a super stealthy feature that I look forward to showing you all. 

  • V4.0 It lives!

    Joshua Elsdon, 03/16/2017 at 15:46 (1 comment)

    Hello everybody. V4 is alive and well. Below are a few upgrades we have implemented.

    • Upgraded to an STM32F0 processor: 48 MHz, lots of nice peripherals, 32k program memory and 4k SRAM. Not too shabby.
    • Front facing sensors are now a type that actually face forwards, therefore we no longer need optics to redirect the light into them.
    • The single upright pin from the previous design is now 2 upright pins. These are used for on-the-fly charging: bump into 5 V and charge away. They are reverse-polarity protected, so don't live your life in fear.
    • The programming header can now be plugged in backwards. It won't function the wrong way around, but it won't explode either.
    • The 'hips' of the robot have been shifted inwards to account for the face plate of the motors. This allows the wheels to run much closer to the robot's main board and make the footprint even smaller.
    • The upgraded processor has enough PWM pins (which are correctly connected) to implement phase control of the motors, as per 'micro-stepping'. This allows the robot to turn the motors slowly without introducing the large vibrations that caused the V3 robot to lose traction.
    • The wheels have been improved (Not strictly an upgrade to the PCB design, though I care about wheels too much to leave them out).
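
The micro-stepping mentioned above boils down to driving the two motor phases with sampled sine/cosine duties. A table-generation sketch follows; the step count and PWM scale are arbitrary illustration values, not the firmware's:

```python
from math import sin, cos, pi

def microstep_table(steps_per_cycle=32, pwm_max=1000):
    """Signed PWM duties for the two phases over one electrical cycle.
    Sign gives winding direction, magnitude the duty."""
    table = []
    for i in range(steps_per_cycle):
        a = i * 2 * pi / steps_per_cycle
        table.append((round(pwm_max * sin(a)), round(pwm_max * cos(a))))
    return table
```

Stepping through the table slowly moves the rotor in small fractions of a full step, which is what smooths out the low-speed vibration.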

    So this version is significantly better. Though every design needs an appropriate errata list! Things that are wrong:

    • The IR receiver IC is connected to a poor pin on the microcontroller. This means we cannot use the hardware UART to receive serial. Also, the particular timer on that pin is not sophisticated enough to implement the best version of the library that receives remote-controller commands. So in general, communication to the robot is now awkward.
    • One of the front IR sensors is not connected to an ADC pin. Rendering it basically useless, unless your light levels are such that the digital threshold happens to be correct.
    • Whilst not breaking the design, due to the keep-out area allowing the axle to pass through, there are too many components near the battery terminal. This makes soldering in the battery a dicey business: you are only ever 8 mil or so away from shorting the LiPo battery out and causing a fire. This will be solved by splitting the axle in two. Using an 'L'-shaped piece of wire, one limb will be the shaft and the other will be the upright pin for charging purposes. This also means that charging pins coming from the centre of the wheels on the side are an option. (Makes me think of the chariots from "Gladiator".)

    Below are some videos and pictures of the newest version for your viewing pleasure.

  • Where have we been? Wheels of course.

    Joshua Elsdon, 02/01/2017 at 15:00 (2 comments)


    If any of you have been following this project you may have noticed that the main problem we need to overcome is making the wheels for the robot. If you scan through previous logs there are something like 4 full posts on the subject (I have lost count). Today may be the last; one can only hope.

    The wheels in question need to be very small, and they have very fine-module gear teeth running around the perimeter to engage with the motor pinion. To date I have tried: model wheels (friction driven), stamped-out rubber disks (friction driven), printing on a Roland DLP printer, casting from a negative printed on the Roland, printing on a ZCorp (glue dust together) printer, and finally building my own printer for the express purpose of producing these wheels.

    Quite the journey. The result: look for yourself!

    You can see that the teeth on the wheel are as well defined as the teeth on the pinion. The pinion gear comes with the motors and is presumably injection moulded in a big fancy factory. These were printed with FunToDo Deep Black (settings: 20u layers, ~70u pixel size, 27s exposure per layer). The detail is pretty crazy, though at ~35s per layer they are not quick to print, so I will work on optimising the print parameters so they take a more sensible time. Check out the printer on my other project page CLICK HERE FOR PRINTER DETAILS, though I warn you that project is rather more sparse on details than this one. All the info you would need to build it is on the page; it is really just one linear slide in a box, so I am sure you clever people can fill in the gaps!

    The primary use for the printer till it gets decommissioned is to produce wheels, though maybe it will also produce some useful jigs for small scale manufacture.

    These layers are 70u, which is stretching how thick a layer is possible for this resin: it is so black that all the UV is absorbed before it can cure any more deeply. With some optimisation it might be possible to get to 100u at around a 30 second cure time (I found the projector was on Eco before, doh!). Also, seeing as I wish these wheels to be printed directly on the bed, I may modify the model to account for the wider-than-designed first layer.

  • Panelisation: Robot Arrays Ahoy!

    Joshua Elsdon, 12/23/2016 at 14:03 (6 comments)

    Hello all,

    Christmas is nearly upon us, and just in time Elecrow delivered a present for us! The next stage in development is finding a way of producing a number of these robots on a small scale. The solution in industry is to create large panels of the boards in question and use pick and place machines to populate them. We wish to do the same thing on a smaller scale, enter mini-panels!

    We can get these pick-and-placed on site without needing to make 100s+ at a time. A batch of 11 robots and a programmer is roughly the size of one 'kit' of robots in our estimation. Below is an image highlighting the techniques we used for panelisation. The programming pins are broken out to 0.1 inch headers for slightly easier programming in batches. Ideally we would have thought about things a little more to get them programmable from one port, but alas, we are lazy.

    To be honest I was expecting the manufacturer to complain about the panelisation, as you usually pay more for it, though we tried our luck and boom, no extra charge ;).

    You can see that this is V4.0, rather than V3.0 which starred in the other posts. V4.0 has some nice new features including:

    • Reduced width at the back, allowing for a much reduced overall width (the wheels can now be tight to the robot).
    • We are using an STM32 processor, which has more PWM hardware, higher clock speeds and crystal-less operation.
    • The collision sensors at the front are now a separate LED and 2 phototransistors. This is so they properly 'look' forward.
    • The upright charging pin is now 2 upright charging pins, one 5 V and the other GND. These have reverse polarity protection. Having a sliding GND contact on the bottom was not feasible due to the fiddliness of building it.
    • Due to more careful thought and more PWM hardware being available, we should be able to efficiently microstep the motors, allowing for less vibration when they are turning slowly.
    • These will be the first version to benefit from an unhealthy obsession with getting the perfect wheel (As will all new versions so long as I have an unobtainable standard for wheels I suppose). You can check out my DLP printer build here . I will be optimising this printer for the production of these tiny wheels.

    For now Christmas is here, though in the New Year a new robot army will be upon us!

  • Experiment in Wheels: Part 4.

    Joshua Elsdon, 12/16/2016 at 16:10 (0 comments)


    So it seems the core issue in this project is how to reliably make the wheels. The resin 3D printer I used before made some OK wheels, though the tolerances were not reliable; also, as the optical path became foggy, the quality of the teeth degraded.

    This time I am experimenting with the ZCorp powder-binder process. Conclusion: the wheels are not as smooth as the resin ones, though they are more consistent. This makes them more usable for this application, where a single dodgy tooth renders the wheel useless.

    The main problem cited with powder-binder printers such as the ZCorp machine I am using is the brittleness of the parts. This is indeed true: the wheels can easily be broken by hand. Though the robots weigh only 5 g, and forces of that magnitude will break almost everything on the robot, so the wheels are not a weak link. They are also white enough to star in a toothpaste advert, which I like very much ;)

  • An update: What has been done in the last 6 months?

    Joshua Elsdon, 12/13/2016 at 15:30 (0 comments)

    Unfortunately, not too much! Life gets in the way at times, as does my crippling laziness. We have made some modifications to the design and sent them to the board house. We hope to have a few demo robots up and running in the new year. I am continuing my experimentation with tiny wheels; an update on that topic will come soon!

    The new board release contains 11 robots and an ST-Link adapter, for ease of small-scale pick and place. It is unlikely we will be making millions of these things any time soon, but the ability to bash out 11 new minions on the local pick and place machine when needed should be a boon for distributing robots to interested parties.

    The new design includes proper forward-facing proximity sensors, dual upright charging pins with polarity protection, and an updated processor with more of everything! We hope to implement some proper motor control with the help of our house motor-control expert ;) and also to have the robots run a simple operating system to aid the modularity of adding new features.

    Fingers crossed that the next update is sooner than 6 months time!

  • Hackathon!

    Joshua Elsdon, 06/20/2016 at 13:24 (1 comment)

    Hello again,

    This weekend we took part in the delightful Hackathon focused on 'Robotics in Education' organised by the Imperial College Advanced Hackspace. We met more than our fair share of interesting people, and even got some work done on the project on the side ;) As part of the Hackathon we were introduced to some new team mates to help us for the 2-week period, and especially the final hack frenzy that happened this weekend. So say hello to Christian, (Tom and myself), Kenny and Nick! Depending on their schedules, we hope that we can continue to work together.

    Christian enthusiastically oversaw the team effort to focus ideas and help us appeal to teachers, kids and technicians alike. So thanks to him we have more streamlined ideas, nice graphics and a fantastic presentation to use in future. Kenny worked on a number of things, including investigating a block programming interface and a web-connected interface; he also happens to be a ROS connoisseur. Nick provided us some grounding in what children would be interested in and what is tenable from a teacher's standpoint; he works as a software engineer and an after-school STEM ambassador, which is a pretty useful set of skills for this project. Nick also helped us prototype a sanitised Python interface for the 'activity' part of the software, such that we are more able to fit in with the UK curriculum in this area. Tom and I continued on the ROS integration and robot firmware. A shot of us being studious; free food and coffee, and an amazing venue!

    Some modifications to the firmware unleashed the beastly speed of the little robots; now we can proudly say that power sliding is a potential issue we will have to deal with. Here is a video: the spinning is in real time, the movement forward is in slow motion.

    That is the small update for anyone who is still listening. A fully integrated demo coming soon, I hope.

View all 22 project logs

  • 1
    Step 1

    First a word of warning: the design is not yet finished, continue at your own peril.

  • 2
    Step 2

    Download the sources from the CircuitMaker page for the PCB (you may need to install the software to do this, I am not sure). Get the PCBs made with 1 mm board thickness. They are 4 layers, and you almost certainly want to order a paste stencil.

  • 3
    Step 3

    After waiting for the boards to arrive, order the components that are in the BOM. Careful: it has not been fully checked, and there may be some silly resistors that we changed on the fly.

View all 9 instructions




EngineerAllen wrote 02/01/2017 at 21:42

impressive soldering

i like the motors

where can i get some?


cglassa wrote 11/07/2016 at 00:41

I am really battling to compile the firmware. I am very new to both STM32 and ROS, so I have spent a lot of time reading and following the instructions - particularly the ones found on

I set up Ubuntu Linux and installed ROS Kinetic, but I just can't seem to get the launch file to work and am almost out of ideas on how to get past what seems to be the last step. 

Would it be possible to get simple step by step / more detailed instructions? 

It has been a very expensive project for me, and although I'd prefer instructions, I would even be happy with a V4 bot firmware to test if the electronics I have made work. 

Any help at all would be really appreciated.


Joshua Elsdon wrote 11/28/2016 at 16:53

Sorry, I have not been on the website for a little while. I am a little confused about what state you are at. Have you built a robot? V4? V3? We have no V4 firmware atm as we have not built V4 ourselves yet! (Project stalled due to work.) Are you running on a Pi3? Or a desktop? I would like to help you out. All of this project is very much experimental, so I cannot offer step-by-step instructions that would remain valid for any length of time, but I would answer questions to you directly.


cglassa wrote 12/12/2016 at 13:15

I built and assembled a bot based on the V4 schematics shared on CircuitMaker. I have been trying to compile the github source in an Ubuntu VM setup on my desktop. I have tried multiple Ubuntu and ROS versions - no luck compiling yet. It might help to know the ideal recommended desktop environment. I am able to connect to the MCU using a ST-LINK/V2 to read the empty STM32F031G6U6, but without any firmware to load, I am not able to do more than that. I will patiently keep on trying whilst following the project's development. If I eventually get something to compile I will post an update here.


Joshua Elsdon wrote 12/13/2016 at 14:57

(apologies, this forum seems to not let me reply to nested messages). Wow. The project was on a little bit of a break due to day work getting the best of me, though you have officially overtaken us lol :P There is no firmware for v4 as we have not built one yet, all firmware is for V3 based on the atmega chip. With ROS I would go with Ubuntu 16.04 and ROS Kinetic, though any 'matching' versions of ROS and Ubuntu would be ok. 

I have only just ordered the PCBs for the new version myself, on which there are a few small modifications which should not be that important if you have already built some. The new V4.1 is manufactured in an array on a small panel to make pick and placing easier; each panel has 11 robots and 1 ST-Link adapter. So we have 110 potential robots on their way from China. Also, if the pick and place goes well I would be happy to forward you some complete robots at cost price + postage, or some raw PCBs for free.

May I ask what your application for the robots is? Also what country are you in? 


cglassa wrote 12/14/2016 at 02:30

Thanks for the bot offer, but hopefully I will be able to get at least one of my version of these awesome little bots up and running... eventually! :)

For me, this was originally intended as a PCBA learning exercise - with the hope of having a really cool toy to play with at the end. 

The firmware side involves 'just a tad' more learning than I was expecting and, after yet another unsuccessful late night, it is officially not my favourite part of this project.

I look forward to seeing the development of your V4.1 bots!


George I Fomitchev wrote 08/15/2016 at 06:41

Stunning and unbelievable...

we try to help educators with a standard size robot


Jin W wrote 05/27/2016 at 17:05

Is it possible for you to publish a schematic and parts list? I would love to build some of my own!


Joshua Elsdon wrote 05/27/2016 at 18:18

Yes we hope to bring the files up this weekend. For the moment we would not strongly recommend building them, as there is a new version that is halfway done. The current design will become obsolete in short order. We want to get a core system that does all the basic things with a full software stack supporting it before we open source everything.


Joshua Elsdon wrote 05/29/2016 at 21:54

Hey hey hey, just put the sources up (see the log or the end of Details), though the disclaimer below is still very true: a new version is on the horizon, and this version has significant flaws.


Overdesign wrote 05/25/2016 at 04:22

Where are these little steppers from?


davedarko wrote 05/25/2016 at 07:21

Probably from cameras with zoom / focus lenses. You can find them on eBay.


Overdesign wrote 05/25/2016 at 08:17

Thanks for the thought, davedarko. I wonder.


Joshua Elsdon wrote 05/25/2016 at 11:29

Yes, they are from cameras. They are pretty cool, though they are incredibly fragile. We have 7 finished robots and 4 of them are out of action due to broken phases in the motors. They can be purchased here: 


Overdesign wrote 05/25/2016 at 14:43

Nice! Thanks for the confirmation. :-)


Overdesign wrote 05/25/2016 at 14:44

(and the review ;-) )


K Gilbert wrote 05/30/2016 at 01:02

I'm having some difficulty soldering these. The pins seem to just barely fit into a breadboard but I cannot get a good grip. Any suggestions?


Joshua Elsdon wrote 05/30/2016 at 14:29

@K Gilbert Hey there, I would not recommend using these with a breadboard. If you are careful you may be able to solder some extension wires on. Make sure you use very thin flexible wire to avoid yanking out the pins. Ideally you should have a PCB to play with these motors, even with one they are damn fiddly and fragile. 

