
Cardware

An educational system designed to bring AI and complex
robotics into the home and school on a budget.

Cardware is an educational system designed to bring AI and complex robotics into the home and school on a budget. It leverages cheap off-the-shelf electronics and kinematics - e.g. a Raspberry Pi Zero, hobby servos and a cheap webcam - plus templated cardboard parts that fold into each of the pieces needed to join them together and make a moving chassis.

It is controlled by a distributed system containing a language that can incorporate its own feedback as well as instruct it, giving it a limited ability to learn. The system also handles all of the geometry necessary to articulate limbs and other parts using natural language on the part of the user, handles speech and visual recognition, and features a rudimentary nervous system to differentiate touch across the surface of the cardboard.

All work contained in this project is subject to Creative Commons Licensing.

The Concept

Modern robotics hardware for the hobby market has been reduced to a series of modules, from servos to processors. The only things connecting these together are wire and chassis, and I've been experimenting with easy-to-obtain materials to build those chassis. Polycarbonate, polystyrene, polyethylene, sheet aluminium etc. were natural choices.

Mark's IO made me think again about the validity of cardboard as a working material, and we began a dialogue that culminated in this collaboration.

The idea is to use a single sheet of craft card and a printed template to create chassis units that connect the electromechanical building blocks together.

We have since iterated over a few changes to the design, incorporating better geometry to give the parts strength, room for cabling and better overall appearance.

The first generation limb mounted on a spare thigh section so I can test the servos...

Mark Nesselhaus was able to replicate three of the pieces so far. If he can do it then anyone can. When using Super Glue please watch the fingers.

Once we had a working prototype chassis, the next job was to get the nervous system up and running.

Mark is now waiting for his 'brain' and 'muscles' to arrive via the postage system (fingers crossed) so he can duplicate the next stage himself... This is the real meat of the project, and forms the interactive part of the cardboard. I'm working on internalising the touch panels for the 2nd generation shell.

The nervous system in action on a first gen limb, attached to my PC. I'd managed to get it working how it was intended - it mimics the biological hallmark of withdrawing from a stimulus, and learning from the experience.

If you watch carefully, you can see that the servos are live and are holding the limb in position until I touch a surface, which it then withdraws from by a short distance. The system will record these motions and tag them to a hierarchical structure so you can 'program' the robot by touch alone. This only requires an MCU, but we are also using a RasPi with a camera and microphone to detect motion and shapes and respond to voice commands.
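The withdraw-and-record behaviour described above can be sketched as a minimal reflex loop. This is a hedged illustration, not the actual AIMos code; the angle representation, `reflex_step` and the withdraw distance are all assumptions.

```python
WITHDRAW_STEP = 10   # degrees to back off on touch (assumed value)

def reflex_step(angle, touched, motion_log):
    """Hold position; on touch, withdraw a short distance and record it."""
    if touched:
        angle = max(0, angle - WITHDRAW_STEP)
        motion_log.append(angle)   # tag the motion so it can be replayed later
    return angle

log = []
angle = 90
angle = reflex_step(angle, True, log)    # touched: withdraw to 80
angle = reflex_step(angle, False, log)   # no touch: hold at 80
print(angle, log)  # -> 80 [80]
```

The recorded angles are what would be tagged into the hierarchical structure, so a sequence of touches becomes a replayable 'program'.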

Testing the system on a Pi revealed the touch panels to be a lot more sensitive, and the cardboard literally came alive. I'm working on a way to integrate both these behaviours into the system.


Origaime - The next generation

The new parts build in exactly the same way as the first set, they fold up from a single piece of cardboard to make a modular piece for the robot. Mark calls this Origaime, after the original robot that inspired everything and the art of folding. :-)

Both these pieces were accomplished using the redesigned saddle that fits over the servo actuator and a stud fitted opposite it to make a live hinge.

The section that joins these together to make active limbs is also made from one piece. This part carries the servos and the stud for the saddle to rotate over. Eventually there will be a range of part styles that can make any articulation you like, not just basic limbs.

This has been reworked so the servo bearers fold out of the chassis itself instead of being separate pieces. This better fits the Way of Origaime...

As I said, these parts are modular. Here's the other end being prototyped onto the shell at the body end. I hope hip replacements get this easy in the future.

And the shell itself of course. This features a customisable panel in the bottom which is used to provide ventilation for the electronics and also structural integrity. This part has the greatest amount of stress to deal with and needs to be solid. I chose a pattern I knew would do this, but then I realised this could be used to aesthetic effect, and I'm going over a few designs for graphics in here. Watch this space... ;-)

Here's a picture of what it looks like after a couple of strong beers. Nobody's perfect. ;-)

It still needs a lid; for now that is a simple octahedral dome with a port for the camera. I'm still working on this too.

All together...

Read more »

aimos_core.ino

Arduino code to control servos synchronously from an asynchronous control structure

x-arduino - 19.04 kB - 03/23/2017 at 21:42

Download

aimosdriver

Python code to interact with AIMos Core

aimosdriver - 6.60 kB - 03/23/2017 at 21:42

Download

hackaday_theme.doc

MS Word format DOC containing replacement shell templates themed with a stylised HaD skull. Scaled for US Letter prints, will also print on A4 without modification.

application/msword - 699.50 kB - 03/28/2017 at 16:32

Download

hackaday_theme.odt

OpenDocument format ODT containing replacement shell templates themed with a stylised HaD skull. Scaled for US Letter prints, will also print on A4 without modification.

application/vnd.oasis.opendocument.text - 698.27 kB - 03/28/2017 at 16:32

Download

origaime.doc

MS Word format DOC containing the full prints for Cardware Origaime v3.2. Scaled for US Letter prints, will also print on A4 without modification.

application/msword - 2.87 MB - 03/28/2017 at 16:32

Download

View all 6 files

  • 1 × ATmega1284P (or an Arduino) for the basic system, hosting the Core systems: nervous system, kinematics and digital sensors
  • 1 × Raspberry Pi Zero, A, B with camera etc. (optional; or a BeagleBone etc.) to host the AI systems: audio/visual interaction, learning, enhanced sensors
  • 1 × Software: AIMos Core and AIMos UX, AIMil language
  • 1 × Chassis hardware: downloadable chassis templates, your own choice of materials
  • 1 × Tools: No. 1 Pozidriv screwdriver, scalpel or craft knife, scissors, glue; optionally a PC for control and Core programming

  • The Eternal Battle: Light vs Dark

    Morning.Star · 11/30/2017 at 09:55

    Shading

    Before I can make this model look real I have to do a bit of work regarding how the light falls on it, and it gets a bit complicated because the real one is curved and the virtual one is made of plane surfaces.

    Calculating the path of a beam of light (ray tracing) as it bounces off a surface isn't easy: you need to know the angle of the beam of light and the angle of the surface, both in 3D. This has to be done after the model is posed for the render, so the only way to know the surface angles for sure is to know their original angles and back-track the gimballing in the same manner as the leg is calculated.

    What, for every pixel? I don't think so... That sounds like hard work. Let's think about it a different way then.

    Cloud Theory to the rescue (again)

    The model is represented as a cloud of atoms with planar surfaces, tiny squares. Each and every one of them has a unique position and angle of incidence, which is calculable, to give the visible area of the model.

    I use IsFacing(<coords>) to tell me whether to draw these polygons, because it calculates the area covered by the polygon; if that area is negative, the polygon is back-to-front. However, the function just returns true or false. If you look at the actual area returned, it is a percentage of the polygon's full area when facing the camera directly. When it is sideways-on it has an area of zero, no matter which way it is facing, left-right or up-down.

    This means, by finding the difference between the drawn area and the original, I can tell how much it is facing me without knowing its actual angle, and shade it accordingly.

    All polygons facing have 100% area, all polygons side-on have 0% area, and all polygons with -100% area are facing away. This makes it very simple to resolve the shading from the camera, and then rotate the scene to show the shading from another angle.

    Because it is a rotation, the area and angle are related by the sine of the angle. The full area can be calculated by finding the lengths of the sides in 3D, as X, Y, Z offsets, treating the polygon as two triangles.

    The Law of Cosines could then be applied to give the angles of the sides, and it could be drawn flat without knowing what angle it is facing in any plane...

    However, after a bit of digging through my research I realised you can find the area of a triangle much more simply if you know the sides, without calculating any angles. Heron's Formula uses the perimeter and ratios of the sides forming it - much nicer. I'd forgotten about that one; usually you won't know all three sides without work, so it's just as quick to find the height.
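    As a quick illustration of why Heron's Formula is handy here: with just the three 3D side lengths you get the area directly, no angles needed. A minimal sketch, not the project's actual code:

```python
import math

def side3d(p, q):
    """Length of one side from two 3D points (X, Y, Z offsets)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def herons_area(a, b, c):
    """Triangle area from its three side lengths alone (Heron's Formula)."""
    s = (a + b + c) / 2.0                  # semi-perimeter
    return math.sqrt(s * (s - a) * (s - b) * (s - c))

# A 3-4-5 right triangle has area 6, however it is oriented in 3D.
tri = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (3.0, 4.0, 0.0)]
a = side3d(tri[0], tri[1])
b = side3d(tri[1], tri[2])
c = side3d(tri[2], tri[0])
print(herons_area(a, b, c))  # -> 6.0
```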

    Once it is drawn flat, the area can be calculated as two Euclidean triangles, and the Cartesian Area subtracted from it to give the angle of incidence as a tangent.

    The resulting figure is a real value between 0 and 1, giving the amount of atoms visible as a percentage. Because each atom has no height dimension, any atom viewed from the side is not visible, and also covers any atoms behind it so they are no longer visible. This means a 10x10 atom square viewed from the side has an area of 10 atoms; its length or width, because all you can see are the leading 10 atoms on the edge. Viewed from directly facing, it has 100 atoms visible. And at 45 degrees only half of each atom shows behind the atom in front, making the area half.

    This means I can calculate how much light falls on any pixel without knowing the angle of the surface it is in, just by calculating how much of it shows. Because I also don't need to know the angle of the light-rays (they are parallel to the camera axis, and therefore zero) I can do this after posing the model, using the information embedded in it as an abstract.
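    Putting the pieces together, the area-ratio shading sketches out like this. A hedged illustration, not the project's renderer; the triangle representation and the 0..1 shade scale are my assumptions:

```python
import math

def herons_area(a, b, c):
    """Full (unprojected) triangle area from its three side lengths."""
    s = (a + b + c) / 2.0
    return math.sqrt(max(0.0, s * (s - a) * (s - b) * (s - c)))

def signed_area_2d(pts):
    """Signed screen area; negative means the polygon is back-to-front."""
    total = 0.0
    for i in range(len(pts)):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % len(pts)]
        total += x1 * y2 - x2 * y1
    return total / 2.0

def shade(tri3d, tri2d):
    """Brightness 0..1: ratio of drawn area to full area, no angles needed."""
    sides = [math.dist(tri3d[i], tri3d[(i + 1) % 3]) for i in range(3)]
    full = herons_area(*sides)
    return max(0.0, signed_area_2d(tri2d) / full)   # facing away -> 0

# Facing the camera head-on, drawn area equals full area: full brightness.
facing = shade(((0, 0, 0), (1, 0, 0), (0, 1, 0)),
               ((0, 0), (1, 0), (0, 1)))
print(round(facing, 6))  # -> 1.0
```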

    Because each polygon...

    Read more »

  • Constituto Imaginalia

    Morning.Star · 11/28/2017 at 09:06

    From Google Translate : Organised Pixels.

    Hang on a minute, how'd that get into Latin anyway? :-D

    Ever Closer...

    The equation governing the shape of the virtual robot now includes every single pixel on its surface. They are all now vectors in their own right and, after a bit of tinkering with a skins editor, will all be uniquely addressable via a flat image that folds around the mesh. That's drawn using the same bitmap for every panel.

    I'm only tinkering at the moment, I'm waiting for one of these to arrive.

    Adafruit 16-Channel 12-bit PWM/Servo Shield - I2C interface


    Yeah I know, the shame lol. I hate to resort to a shield, and one from Adafruit at that, but I'm thoroughly sick of the industry making it impossible to join the various modules it uses together without their glue and platform. Plus the power supply issues were insoluble without them. The number of servos, plus the fact that they sink a quarter-amp each and need 6V, meant I can no longer buy the components reliably. The rechargeable batteries run at 3.7V, which is a maddening figure. With two in series you get a little over 7V under load, which will toast a servo but not allow enough overhead for a decent 6V regulator. There are low-drop types, but they don't supply 3 amps, even the SMD ones, if you can find them.

    I mean, it's only a chip, but unless I pay somebody to mount it with a decent power converter and all the headers I can't use it. And the Point Field Kinematics doesn't squeeze into an Atmel 328 alongside the servo sequencer.

    I decided to drive the servos straight off the Pi, then discovered it only has one PWM channel... So I went looking, and that was the first thing I encountered. Oh, nuts. I clicked BUY before I thought about it too hard.

    Providing the servos run off 3.7V, and I've seen it done, this should work with my existing hardware.

    Reading the limb sensors

    I completely forgot about the potentiometers in the joints... A Pi has no ADC built in, unlike the 328, and I need 4 for each limb, 8 in total. Even with a 328 onboard that's still not enough, and using two just for their analogue inputs seems daft.

    I've done a bit of digging to see if there is an ADC I can attach to a GPIO, and Adafruit have one ready made. I can get hold of the chips for that myself, and they use SPI to communicate so that's easy.

    MCP3008 - 10-bit, 8-channel ADC.

    Pretty much plug the chip into the Pi's GPIOs, cable it to the sensors and that's done. The Pi has software SPI, so I can configure the GPIOs in code. I've ordered one, but it won't arrive until the weekend.
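    For when it arrives: reading the MCP3008 boils down to clocking out a start bit, a single-ended flag and the 3-bit channel number, then keeping the last 10 bits that come back. A hedged sketch of just that framing; the commented-out hardware path assumes the non-stdlib spidev package on bus 0, chip-select 0:

```python
def mcp3008_cmd(channel):
    """Start bit, single-ended mode flag and channel, packed into 3 bytes."""
    if not 0 <= channel <= 7:
        raise ValueError("channel must be 0-7")
    return [1, (8 + channel) << 4, 0]

def mcp3008_decode(resp):
    """The last 10 bits of the 3-byte response hold the sample (0-1023)."""
    return ((resp[1] & 3) << 8) | resp[2]

# Hardware path (needs the non-stdlib spidev package and a wired-up Pi):
#   import spidev
#   spi = spidev.SpiDev()
#   spi.open(0, 0)                # bus 0, chip-select 0
#   spi.max_speed_hz = 1350000
#   value = mcp3008_decode(spi.xfer2(mcp3008_cmd(0)))

print(mcp3008_cmd(0))  # -> [1, 128, 0]
```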

    Bummer, I'll just have to polish some pixels until it all arrives.

  • Bouncing off the walls

    Morning.Star · 11/23/2017 at 16:29

    Mapping is the next big push, mathematically speaking. Luckily I have a head-start on that; I had begun working on it with the original AIME, and have some scrappy code snippets I found randomly on the 'net. I don't know who authored this piece of logic, but they deserve a medal.

    bool pinside(poly2d poly, double x, double y) {
    
      int   i, j=poly.nodes-1;
      bool  oddNodes=false;
    
      for (i=0; i<poly.nodes; i++) {
        if ((poly.coord[i].y< y && poly.coord[j].y>=y
        ||   poly.coord[j].y< y && poly.coord[i].y>=y)
        &&  (poly.coord[i].x<=x || poly.coord[j].x<=x)) {
          oddNodes^=(poly.coord[i].x+(y-poly.coord[i].y)/(poly.coord[j].y-poly.coord[i].y)*(poly.coord[j].x-poly.coord[i].x)<x); }
        j=i; }
    
      return oddNodes; }

    What it does, is establish whether a point is inside or outside of a complex polygon. Simple polygons are easy, a rectangle particularly so because you can just test it as a matrix. Any cell with a negative index or greater than the width or height is obviously outside. Regular polygons are harder, but still fairly simple - checking the angles of the sides against the line between each corner and the test point to see if they are all larger than that angle is one way of doing it.

    But what about horrible shapes like these (and they will be really easy to create by exploring...)

    That second one is particularly nasty. Inkscape has a special routine to handle it in one of several ways, depending on how you want it filled. This is the default: areas that cross areas already used by the shape are considered outside, unless they are crossed again by a subsequent area. This is how the code snippet interprets a polygon like that, and it's useful for making areas from paths, for example.

    It works by counting the number of boundaries crossed by exiting the polygon from the test point. Obviously a simple regular polygon like a hexagon would return 1 boundary crossed, an odd number of boundaries crossed will always indicate the test point being inside the shape, and an even number of boundaries crossed will indicate outside (providing the crossed-areas rule is observed.)
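    For experimenting, the C snippet ports to Python almost line for line (same even-odd rule; `poly` here is just a list of (x, y) pairs):

```python
def pinside(poly, x, y):
    """Even-odd test: True if (x, y) is inside the polygon (list of (x, y))."""
    odd = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # Does edge i-j cross the horizontal line at y, to the left of the point?
        if ((yi < y <= yj) or (yj < y <= yi)) and (xi <= x or xj <= x):
            if xi + (y - yi) / (yj - yi) * (xj - xi) < x:
                odd = not odd
        j = i
    return odd

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(pinside(square, 0.5, 0.5))  # -> True
print(pinside(square, 1.5, 0.5))  # -> False
```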

    This is of particular use for sequential-turn mapping of an area, because it identifies the areas enclosed by the path that need exploring, as turns are generated by obstacles that might not be boundaries.

  • I'm walkin here

    Morning.Star · 11/22/2017 at 09:54


    I've finally managed to puzzle out the gimballing for the limbs to the point where I can address them from either end of the chain.

    I discovered it wasn't possible to mirror the mesh internally to handle standing on the opposite foot, as I'd planned. It became a nightmare involving such tricks as defining the second mesh inside-out, with anticlockwise polygons, but the gimballing was then reversed too and I gave up; I can't get my head around that.

    Instead, I've defined the robot from the left foot which, when it isn't being stood on, calculates its position backwards from the other foot and then rotates the entire model to make that foot face downwards. Providing all the interstitial rotates are symmetrical, the feet remain parallel to the floor and the servo angles all sum to 180.

    This is highly useful, because it applies with the limbs in a position to step up onto a platform as well as with both of them flat on the floor... Using this information, I've defined a series of poses for the limbs that I can interpolate between to get smooth motion. I've called these seeds, and all the robot's motions will be defined using them.
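    The seed idea sketches out like this (hypothetical joint names and angle values; the real seed format isn't shown in the logs):

```python
def lerp_pose(a, b, t):
    """Blend two seed poses; t runs from 0.0 (pose a) to 1.0 (pose b)."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

# One angle per joint: anklex, ankley, knee, hipx, hipz (assumed order).
stand = (90, 90, 90, 90, 90)
lift  = (90, 70, 120, 100, 90)

# Eleven frames of smooth motion between the two seeds.
frames = [lerp_pose(stand, lift, i / 10.0) for i in range(11)]
print(frames[5][2])  # knee halfway between 90 and 120 -> 105.0
```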

    Here is a full seed set cycled around. I've tilted the display so the part rotates are clearer, but this isn't included in the world coordinate system yet, so the position jumps. I'll be refining these, and adding Z rotates for the hips so it can turn. That adds yet another calculation for the far foot position in the mapping system, and I'm not there yet.

    The next task is to add footprints to the world coordinates, which will enable mapping and platforms, but first I have to integrate the balancing routine and switch that on and off periodically. The seeds are intended to bring the system close to balance, so that I can also use inertial forces later in the development. This will be a matter of timing as well as servo positions; currently I'm working on mapping mode, where it has to determine if there is floor to step onto and thus balance on one foot.

  • Twas brillig, and the slithy Toves did gyre and gymble

    Morning.Star · 11/13/2017 at 09:48

    Point Cloud Kinematics

    There's actually a lot more to it than meets the eye as far as information is concerned, but it's embedded and bloody hard to get to, because it's several layers of integrals, each with their own meta-information. I've touched on Cloud Theory before, and used it to solve many problems including this one, but for a cloud to have structure requires a bit of extra work mathematically.

    Our universe, being atomic, relies on embedded information to give it form. Look at a piece of shiny metal: it's pretty flat and solid, but zoom in with a microscope and you see hills and valleys, great rifts in the surface, and it doesn't even look flat any more.

    Zoom in further with a scanning electron microscope and you begin to see order - regular patterns as the atoms themselves stack in polyhedral forms.

    If you could zoom in further still you'd see very little, because the components of an atom are so small even a single photon can't bounce off them. In fact they are so small they only exist because they are there, and they are only 'there' because of boundaries formed by opposing forces creating an event horizon - a point at which an electron, for example, is considered part of an atom or not. It's an orbital system much like the solar system; its size is governed by the mass within it, which is the sum of all the orbiting parts. That in turn governs where it can be in a structure, and the structure's material behaviour relies upon it as meta-information.

    To describe a material mathematically, you then have to also supply information about how it is built - much as an architect supplies meta-information to a builder by using a standard brick size. Without this information the building won't be to scale, even the scale written on the plan. And yet that information does not appear on the plan; brick size is meta-information: information that describes information.

    A cloud is a special type of information. It contains no data. It IS data, as a unit, but it is formed solely of meta-information. Each particle in the cloud is only there because another refers to it, so a cloud either exists or it doesn't as an entity, and is only an entity when it contains information. It is self-referential, so all the elements refer only to other elements within the set, and it doesn't have a root like a tree of information does.

    A neural network is a good example of this type of information, as is a complete dictionary. Every word in the language has a meaning which is described by other words, each of which is also described. Reading a dictionary as a hypertext document can be done, however you'd visit words like 'to', 'and' and 'the' quite a few times before you'd accessed every word in it at least once. You could draw this map of hops from word to word, and that drawing is the meta-map for the language: its syntax, embedded in the list of words. Given wholemeal to a computer, it enables clever tricks like Siri, which isn't very intelligent even though it understands sentence construction and the meaning of the words within a phrase. There's more - context - which supplies information not even contained in the words. Structure...

    This meta-information is why I've applied cloud theory to robotics, and so far it has covered language processing, visual recognition and now balance, and even though the maths is complicated to create it, cloud-based analysis of the surface of the robot is a lot simpler than the trigonometry required to calculate the physics as well.

    But it's not all obvious...

    I first tried to create a framework for the parts to hang off of and immediately ran into trouble with Gimballing. I figured it would be a simple task to assign a series of coordinates from which I could obtain angle and radius information, modify it, and then write it back to the framework coordinates.

    This works, and hangs the parts off correctly using the axes to offset each part....

    Read more »

  • Pixelium Illuminatus

    Morning.Star · 11/08/2017 at 09:39

    And other arcane mutterings.

    After meeting a dead-end in AIMos with the image recognition based on pixels, I realised I'd have to either make pixels triangular to use Euclidean math on them, or find a way to make Euclidean math work on 4-sided polygons to match the Cartesian geometry of a photo. Digital images are pretty lazy - just a grid of dots with integer dimensions, reduced to a list of colours and a width and height.

    It isn't immediately obvious, but that isn't how a computer handles a pixel on screen, because of scalable resolution. Once inside, it has 4 corners with coordinates 0,0 , 1,0 , 1,1 and 0,1 and happens to be square and the right size to fit under a single dot on the display. The display is designed for this, and modern monitors can even upscale fewer pixels to give a decent approximation of a lower resolution image.

    This interpolation, averaging of values, can also be used to reshape an image by getting rid of the pixels completely, which turned out to be the answer to the problem.

    Cardware's internal rendering system hybridises Euclidean and Cartesian geometry to produce a bitmesh, which is a resolution-independent representation of a digital image. It can't improve the resolution of the image, so it works underneath it, using several polygons to represent one pixel and never fewer than one per pixel.

    This is achieved by using the original resolution to set the maximal size of the polygons, and then averaging the colours of the underlying pixels. Then, whenever that polygon is reshaped, it maintains the detail contained in it, as well as the detail between it and its neighbours, independently of the screen grid. Taking the maximum length of the sides and using that as the numeric base for the averaging does this a lot faster than Fourier routines, even Fast Fourier ones, to abstract and resolve the pixel boundaries.
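    The averaging step itself is simple enough to sketch. This is a hedged illustration, not the bitmesh code; polygons are reduced here to the set of pixel cells they cover, which is assumed to be already determined:

```python
def average_colour(pixels, cells):
    """Mean colour of the source pixels one polygon covers.

    pixels: dict mapping (x, y) -> (r, g, b)
    cells:  the (x, y) coords of the pixels under the polygon
    """
    sums = [0, 0, 0]
    for xy in cells:
        for k, v in enumerate(pixels[xy]):
            sums[k] += v
    return tuple(s // len(cells) for s in sums)

img = {(0, 0): (10, 20, 30), (1, 0): (30, 40, 50)}
print(average_colour(img, [(0, 0), (1, 0)]))  # -> (20, 30, 40)
```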

    Because the system now has an abstraction of the image, it can be played with so long as the rules of its abstraction are obeyed. Everything is in clockwise order from out to in, or left-to-right and down as per writing, and has logical boundaries in the data that obey Cartesian rules. This means I can use Pythagorean maths, but handled like Euclidean triangles with unpolarised angles that are simply relative to each other.

    Triangles are unavoidable, but I struck on the idea of making them 4-sided so they handle the same as squares and don't require special consideration. A zero-length side and a zero theta does not interfere with any of the maths I've used so far, and only caused one small problem with an old routine I imported from AIMos. That was easy to write a special case for, and isn't really part of the maths itself, but part of the display routines.

    Here's a stalled draw from the Python POC code showing the quality the system can achieve. I was expecting Castle Wolfenstein, but this is going to be more like Halo - near-photographic resolution, and fast too.

    The Python that draws this is the calculation routine with pixels obtained from the source map and re-plotted. Once the polymap has been deformed by the mesh and rotated into place those pixels will be polygons and the holes will disappear. The original was 800x600 and takes around 12 seconds to fully render. Once in C++ this will come down to a fraction of a second for a photo quality shell-render of the entire robot, maybe a few frames a second if I'm careful.

    Not in a Pi though, so compromises will have to be made...


    Yeah I know, walking, I'm not ignoring it.

    Actually this maths is all related directly to it as well as the recognition system and the perceptual model I'm trying to build...

    OK, so now I know where the actual centre of a triangle is without an awful lot of messing around. That's an equilateral and quite easy to calculate, but a right-angled triangle, two of which you find in a square, will give you...

    Read more »

  • Building Worlds

    Morning.Star · 10/30/2017 at 11:07

    ...And populating them

    * Now with code to handle multiple objects, save and load objects to a human-readable file, and rotate the scene with the cursor keys. This is about as far as I am taking the Python code, as it has served its purpose: to design and assemble the models. There is a bit of tidying up needed - a routine to attach the limbs in a proper chain using the joint angles to display the entire model, and another to export the parts as WaveFront object files, which is pretty easy. Then Cardware can be 3D-printed, milled and rendered in almost any rendering package, commercial or otherwise, be simulated in many engines, and be represented by a networked Cardware engine optionally attached to the sensors on a real Cardware model...

    Most of this code will be rewritten in C++ so I can draw those polygons with my own routines and handle the depth on a per-pixel level. This also means I can wrap a photograph of the real one around it. Python is way too slow for this without a library, and PyGame's polygon routines would still be terrible even if they understood 3D coordinates. ;-)

    I suppose I have to make that skins editor I toyed with while printing the plans now, to make that easy. Double-sided prints are possible but are a bit of a pain to line up properly, and I don't have cash for colour cartridges laying around, or at least some of my Metal Minister would have been photographic. Using a skins editor can make that simple, and it isn't just for pretty, although that is a consideration;

    The main reason is to map the robot's markings to its geometry so it is recognisable to another. Because they are a shared system, this information is available to all robots in the network, so they can say this is where I am and this is what I look like, and the visual recognition system can verify it and link it to the world math. But it doesn't have to be a real robot; it can equally be a simulation, which the individual robot's 'mind' won't be aware of. Even the simulated ones...

    We beep, therefore we are. :-D

    To understand the world, and indeed itself, a robot needs to metricate everything and represent it mathematically. It is very important for movement that the robot also understands its perimeters and not just the angles of the joints, so it has situational awareness. So, the first thing I had to do was measure everything and create mathematical models for it to represent itself with.

    First the easy bits - simple to pull measurements directly off the plans in mm. I'm using that because it fits better into denary math and I can round to the nearest mm without affecting the shape. The card itself is 0.5mm thick and the physical model can vary by that anyway.

    Turns out the model can deviate quite a lot from the plans even when they are printed and folded to make the actual model, and accuracy has little to do with it in practice on some parts. More on that later...

    The Thigh Module

    Hip and Ankle Modules are simply half of the above part, easy to generate a mesh for during import. The legendary Hip Unit (Units are unpowered, Modules are powered) was already measured to calculate the inner radius from the outer, a 2mm difference in diameter.

    The Hip Unit

    The foot is more complicated. Filled curved shapes are a nightmare to compute edges for, so I've broken them into a mesh. This was done manually in Inkscape from the original drawing.

    Overlaid with polygons and then measured - that's the foot done too.

    The Foot Module

    The lid was a little bit more complicated. While I can draw orthogonal plans to scale, I'm not entirely sure that's accurate to the mm in all three dimensions. The original MorningStar was not calculated to be the shape it was; I discovered the fold almost accidentally and then figured out the mathematics for it, then used that to compute the lid dimensions as curves. Interpolations from that math was...

    Read more »

  • More Power Igor

    Morning.Star · 10/21/2017 at 10:36

    Well, finally I was let down by a lack of amps; the batteries I have just aren't strong enough to move the limbs without at least doubling them up. And I'll need to upgrade the regulators. Such is life...

    I added a little bit of code to the Pi's monitor. The raw_input() function was going to be replaced by an open() and list() on a script containing mnemonics.

    def injector():
      global port,done,packet
      
      print 'Starting input thread.'
      
      joints=['anklex','ankley','knee','hipx','hipz']
      cmds=['movs','movm','movf','movi']
      syms=['s','m','f','i']
      rngs=[15,10,5,1]
      while not done:
        cmd=-1
        spd=-1
        srv=-1
        agl=-1
        print '->',
        i=raw_input('')
        if i!='':
          inp=i.split(' ')
          if len(inp)==3:
            srv=0
            if inp[0] in cmds: cmd=cmds.index(inp[0])
            if inp[1] in joints: srv=joints.index(inp[1])+1
            try:
              agl=int(inp[2])
            except: agl=-1
            if cmd>-1:
              spd=rngs[syms.index(cmds[cmd][3])]
            if cmd>-1 and srv>-1 and agl>-1 and spd>-1:
              checksum=1+srv+agl+spd # int(i1)+int(i2)+int(i3)+int(i4)+int(i5)
              chk2=(checksum & 65280)/256
              chk1=(checksum & 255)
              port.write(chr(chk1)+chr(chk2)+chr(1)+chr(srv)+chr(agl)+chr(spd)+chr(0))
              sleep(1)
        else: done=True

    This would allow me to program motions for the servos, if they actually moved. They hum, but they just don't have the juice.

    Closing existing port... Port not open
    Clearing buffers...
    Connected!
    Starting input thread.
    -> movm knee 255
    Servo: 10 10 128
    Move: 10 128
    Move: 9 140
    Move: 8 153
    Move: 7 166
    Move: 6 178
    Move: 5 191
    Move: 4 204
    Move: 3 216
    Move: 2 229
    Move: 1 242
    -> movm knee 128
    Servo: 10 10 255
    Move: 10 255
    Move: 9 242
    Move: 8 229
    Move: 7 216
    Move: 6 204
    Move: 5 191
    Move: 4 178
    Move: 3 166
    Move: 2 153
    Move: 1 140
    -> movm anklex 0
    Move: 10 128
    Move: 9 115
    Move: 8 102
    Move: 7 89
    Move: 6 76
    Move: 5 64
    Move: 4 51
    Move: 3 38
    Move: 2 25
    Move: 1 12
    -> movm anklex 128
    Move: 10 0
    Move: 9 12
    Move: 8 25
    Move: 7 38
    Move: 6 51
    Move: 5 64
    Move: 4 76
    Move: 3 89
    Move: 2 102
    Move: 1 115
    -> movm anklex 255
    Move: 10 128
    Move: 9 140
    Move: 8 153
    Move: 7 166
    Move: 6 178
    Move: 5 191
    Move: 4 204
    Move: 3 216
    Move: 2 229
    Move: 1 242
    -> movi anklex 128
    Move: 1 255
    -> movi anklex 0
    Move: 1 128
    -> movs anklex 0
    Move: 15 0

    This is the above code interacting with a live processor controlling the servos. For now I have defined the following instructions:

    • MOVI - Move Instant; sets the servo angle to the destination in one step.
    • MOVS, MOVM and MOVF - Move Slow, Medium, Fast; sets the servo to the position specified, interpolating by decreasing numbers of steps.
    • The servos are specified by name (AnkleX, AnkleY, Knee, HipX and HipZ), and the angle is given directly as an 8-bit number between 0 and 255.

    These execute over time inside the Atmel without further instruction, freeing up the ports and main processor to observe the sensors and interject to compensate. It isn't ideal, but it is a lot faster than trying to control the servos directly over those serial links.
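    Putting those pieces together, the path from mnemonic to wire packet can be sketched in Python (this mirrors the injector code above; the function name is mine, and the serial write is left out so the mapping can be shown on its own):

```python
# Sketch: how a mnemonic like "movm knee 200" becomes the 7-byte packet
# the injector sends. Names and layout follow the injector code above.

JOINTS = ['anklex', 'ankley', 'knee', 'hipx', 'hipz']   # servos 1..5
CMDS   = ['movs', 'movm', 'movf', 'movi']
RNGS   = [15, 10, 5, 1]                                  # steps per speed

def build_packet(line):
    """Return the 7 packet bytes for one mnemonic, or None if invalid."""
    parts = line.split(' ')
    if len(parts) != 3 or parts[0] not in CMDS or parts[1] not in JOINTS:
        return None
    srv = JOINTS.index(parts[1]) + 1        # servos are numbered from 1
    spd = RNGS[CMDS.index(parts[0])]        # movs=15 steps ... movi=1 step
    agl = int(parts[2])                     # 8-bit angle, 0-255
    checksum = 1 + srv + agl + spd          # control byte is always 1
    chk1 = checksum & 255                   # low byte goes first on the wire
    chk2 = (checksum & 65280) // 256        # then the high byte
    return [chk1, chk2, 1, srv, agl, spd, 0]

# e.g. "movm knee 200": srv=3, spd=10, checksum=1+3+200+10=214
# -> [214, 0, 1, 3, 200, 10, 0]
```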

    I'm going to take a little break from this for a while. Or I will begin to dislike it... And I need to paint, and play my guitar some to chill out.

  • Broken but not bowed

    Morning.Star 10/20/2017 at 13:14 0 comments

    OK, so I've had a good cuss at those rotten processors. 'Well, I think MOST of them worked when we packed them...' just isn't good enough when money changes hands, so I'm done with them.

    Maybe I'll come back to them later along with the multiprocessor networking, when I've had a break...

    Meantime, fuelled by panic and a weak but nonetheless calming solution of Ethyl, I rewired the processor board after snapping off and junking the ESP partition. I had to; I could not get either of the two ATMega 1284P-PU chips to program. One's in a Sanguino with Sprinter installed on it; I flashed that myself so I know it's good. Except it isn't. I wasted an hour or two on it before it followed the ESP onto the pile of failed crap I paid money for: four Duinos, two ESPs, a handful of voltage dividers and a few chips just lately.

    Oh well, trusty ATMega 328P-PUs it is then. Each has its own UART, and the control code on the Pi talks to the processors via the USB ports, leaving one spare for the camera and one for the WiFi dongle. There is also the option of putting a precompiled solution on the ESP and interfacing to it via the Pi's 40-way connector using #pi2wifi .

    That's now operational and contains a working servo sequencer.

    I have managed to rewrite the multicore.ino code to interface directly with the PC and got it working.

    #include <Servo.h> 
    const int host=1;                                 // this processor
    const int servos=5;
    // servo definition structure:
    // articulation
    //      |____ servo (servo object)
    //      |          |____ attach()
    //      |          |____ write()
    //      |____ pin (physical pin number)
    //      |____ min (minimum range of movement 0-255)
    //      |____ max (maximum range of movement 0-255)
    //      |____ home (home position defaults to 128; 90 degrees)
    //      |____ position (positional information)
    //                 |____ next (endpoint of movement)
    //                 |____ pos (current position as float)
    //                 |____ last (beginpoint of movement)
    //                 |____ steps (resolution of movement)
    //                 |____ step (pointer into movement range)
    //
    // packet configuration:
    // bytes 1-2: header - 16-bit checksum, low byte first
    // byte 3: control - 1 byte packet type
    // byte 4: parameters - 1 byte meta
    // byte 5: data 1 - arbitrarily assigned
    // byte 6: data 2 - arbitrarily assigned
    // byte 7: data 3 - arbitrarily assigned
    struct servo_position {                           // servo status
      int next;
      float pos;
      int last;
      int steps;
      int step;
    } ;
    typedef struct servo_position servo_pos;          // atmel c++ curiosity, substructs need a hard reference
    struct servo_definition {                         // servo information
      Servo servo;
      int pin;
      int min;
      int max;
      int home;
      servo_pos position;
    } ;
    typedef struct servo_definition servo_def;        // servo structure containing all relevant servo information
    servo_def articulation[servos];                   // array of servo structures describing the limb attached to it
    int mins[]={ 0,0,0,0,0,0,0,0,0,0,0,0 };           // defaults for the servo ranges and positions
    int maxs[]={ 255,255,255,255,255,0,0,0,0,0,0,0 };
    int homes[]={ 128,128,128,128,128,0,0,0,0,0,0,0 };
    unsigned int check,checksum;                      // 16-bit; the byte sum can exceed 255
    unsigned char chk1,chk2,ctl,prm,b1,b2,b3;
    void setup() {
      Serial.begin(115200);
      while (!Serial) { ; }                           // wait for the port to be available
      for (int s=0; s<servos; s++) {                  // iterate servos
        articulation[s].servo.attach(s+2);            // configure pin as servo
        articulation[s].pin=s+2;                      // echo this in the structure
        articulation[s].home=homes[s];                // configure the structure from defaults
        articulation[s].min=mins[s];
        articulation[s].max=maxs[s];
        articulation[s].position.next=homes[s];
        articulation[s].position.pos=homes[s];
        articulation[s].position.last=homes[s];
        articulation[s].position.steps=0;
      }
      
      for (int d=0; d<1000; d++) {                    // garbage clear
        if (Serial.available() > 0) { unsigned char dummy=Serial.read(); }
        delay(1);
      }
    } 
    void loop() { 
      if (Serial.available() >= 7) {                  // if there is a packet
        chk1=Serial.read();                          // read the packet
        chk2=Serial.read();
        ctl=Serial.read();
        prm=Serial.read();
        b1=Serial.read();
        b2=Serial.read();
        b3=Serial.read();
        checksum=chk1+(chk2*256);
        check=ctl+prm+b1+b2+b3;
        if (checksum!=check)...
    Read more »
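    The sequencer's stepping scheme isn't shown in the truncated listing, but it can be reconstructed from the Move: lines logged in the entry above. This Python sketch is my reconstruction, not the actual Atmel code: the position advances by a fixed float increment while the step counter counts down, and the integer position is reported at each step.

```python
# Sketch of the servo interpolation, reconstructed from the Move: logs.
# The servo starts at 'last' and walks toward 'next' in 'steps' ticks;
# each tick reports (step, int(position)) as the countdown runs 10..1.

def interpolate(last, next, steps):
    pos = float(last)
    inc = (next - last) / float(steps)   # fixed increment per tick
    out = []
    for step in range(steps, 0, -1):     # count the steps down to 1
        out.append((step, int(pos)))     # report before advancing
        pos += inc
    return out

# interpolate(128, 255, 10) reproduces the 'movm knee 255' log:
# (10,128), (9,140), (8,153), ... (2,229), (1,242)
```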

  • Wish I'd seen this coming

    Morning.Star 10/19/2017 at 07:34 0 comments

    F*ing ESP8266, I've had my doubts about it right from the start.

    I built this to control the servos...

    Using the multiprocessor serial networking. The TX of each is on a flylead that plugs into the RX of the next, and it's broken out on the top-left corner so I can connect the TX and RX of the UART.
    It's powered by a 5V 2A supply, and the whole thing is paranoically well grounded and shielded to prevent any crosstalk.

    Here the ESP is connected directly to the UART and has been programmed with multicore.ino:
    const int host=1;
    const int packlen=1;
    unsigned char bytes;
    int bootdelay;
    void setup() {
      Serial.begin(115200);                           // open serial port
      while (!Serial) { ; }                           // wait for the port to be available
      if (host==1) { bootdelay=1000; } else { bootdelay=5000; }
      for (int d=0; d<bootdelay; d++) {               // garbage collect delay
        if (Serial.available() > 0) { unsigned char dummy=Serial.read(); }
        delay(1);
      }
      
    } 
    void loop() { 
      int val1;
      if (Serial.available() > 0) {                  // if there is a byte waiting
        bytes=Serial.read();                         // read it
        val1=(int)bytes;
        Serial.write(bytes);                         // and echo it straight back
      }
    }

    This has had everything stripped out, so all it does is return what it is sent. It can be uploaded to the ESP or the Atmel without modification and should perform the same. On an Atmel, viewed from a simple port scanner and byte injector script:
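    The injector script itself isn't listed; this is a minimal sketch of its byte formatter, reconstructed from the log lines below (the function name and the exact rule for when the raw character is appended are my assumptions):

```python
# Sketch of the scanner's byte display: each received byte is shown as
# binary - hex - decimal, with the raw character appended for high bytes.
# Reconstructed from log lines like '11111111 - ff - 255' - an assumption.

def show_byte(b):
    out = '{0:08b} - {1:02x} - {2}'.format(b, b, b)
    if b >= 128:                 # high bytes also show their raw character
        out += ' - ' + chr(b)
    return out

# show_byte(1) gives '00000001 - 01 - 1', matching the first log line.
```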

    Port not open
    Clearing buffers...
    Connected!
    Starting input thread.
    Send command byte
    Byte : 1
    00000001 - 01 - 1  
    Send command byte
    Byte : 2
    00000010 - 02 - 2  
    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    Send command byte
    Byte : 255
    11111111 - ff - 255 - �
    Send command byte
    Byte : 
    Thread done!
    Main done!
    

    However, uploaded to an ESP (and I have now tried two, one from eBay and one that Ars sent me) it does this instead.

    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    00000000 - 00 - 0  
    00000000 - 00 - 0  
    11111110 - fe - 254 - �
    11111110 - fe - 254 - �
    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    Send command byte
    Byte : 4
    10000010 - 82 - 130 - �
    Send command byte
    Byte : 1
    11000000 - c0 - 192 - 
    Send command byte
    Byte : 1
    11000000 - c0 - 192 - 
    Send command byte
    Byte : 2
    00000010 - 02 - 2  
    11111111 - ff - 255 - �
    00000000 - 00 - 0  
    00000000 - 00 - 0  
    11111111 - ff - 255 - �
    Send command byte
    Byte : 00000000 - 00 - 0  
    00000000 - 00 - 0  
    11111100 - fc - 252 - 
    11111110 - fe - 254 - �
    11111110 - fe - 254 - �
    11110000 - f0 - 240 - 
    11111100 - fc - 252 - 
    11111111 - ff - 255 - �
    11110000 - f0 - 240 - 
    11111110 - fe - 254 - �
    11111010 - fa - 250 - 
    11111111 - ff - 255 - �
    11111111 - ff - 255 - �
    00000000 - 00 - 0  
    11100000 - e0 - 224 - 
    10000000 - 80 - 128 - �
    11000000 - c0 - 192 - 
    11111000 - f8 - 248 - 

    Of particular note is the last section. While I was sitting and puzzling over where all the spare bytes were coming from, a few more appeared at random. Then a few more, and more as I watched. Eventually it filled the terminal over about 10 minutes in random bursts.

    The Atmels don't do it using the same code on the same board, with the same power supply and connected to the same rig, so I really have no explanation other than that I have two bad processors and have wasted my time trying to get something broken to work. It's making up data on its own; no wonder I couldn't get anything meaningful out of perfectly good code, code which I wrote blind, I might add. It runs on the Atmels without modification, but I had to add garbage collection and checksumming just to get it to run with the ESP, and now I know why.

    I've built two motherboards (one using brand-new hardware), hacked together a test rig with flyleads and Arduinos, and built an ESP programmer board. Nothing worked. Finally I stripped it all back and watched it taking the piss...

View all 32 project logs

  • 1
    Step 1

    First print out the full set of parts on 200-220gsm card stock. If you are using dark-coloured card, print the templates onto 85gsm photocopy paper and glue it to the card.

    Score all of the dotted lines well with a sharp point and a ruler; a ball-point pen is perfect for this, as it also gives you a visual reference (thanks @K.C. Lee). Be careful not to score so deeply that you damage the card and it tears. You just need it to bend in the right place.

    Cut out each of the parts and bend them into the basic form. It should be obvious where the tabs glue together by test-folding the pattern into shape.

    Wiring the antennae is fiddly but simple. Join them all together and insert them into the part. Be careful to label or otherwise mark which wire is which, or this will have to be worked out by trial and error in the software.

  • 2
    Step 2

    The modules are joined together with one screw through the saddle on one side that goes through a plastic bearer to stop the card wearing through. The other side of the saddle secures to the actuator arm on the servo with two screws.

  • 3
    Step 3

    Saddles look complicated but are easy to do once you know how. It's just a matter of following the gluing sequence, because some of those joints are multi-layer and internal. It is logical, however: if it doesn't fit, you are trying to fit it wrong. I can't stress enough the benefits of test-folding the part to see how it fits before gluing.

View all 4 instructions


Discussions

Morning.Star wrote 03/16/2017 at 08:02 point

Cardware officially welcomes another collaborator.

Irony begs me to point out that I now have two cardboard Generals, Mark One and Mark Two.

XD


markgeo wrote 03/15/2017 at 06:54 point

Your latest update is even more awesome. I'm a version behind on my building due to not having enough servos. Currently waiting on more servos to arrive in the mail. Looks like I better place another order because I definitely want to try the Origaime version.


Morning.Star wrote 03/15/2017 at 16:17 point

Thanks Mark. Lol, I have the same problem. I so want one of these running around already!!

I've posted the latest generation files and instructions to give you a good head start. Mark Nesselhaus has fed back a few things from his build that I have incorporated into them. You'll also need an ATMega1284P or larger, say a 2560 based Arduino and the toolchain to make the sensory bus work.

* Spoiler alert * We are now into the stereoscopic vision system.

We are looking for collaborators to test our designs, you'd be welcome to a project log of your own describing your build. ;-) Welcome to the Way of Origaime!


Duncan29 wrote 03/02/2017 at 17:03 point

Absolutely brilliant mate, nice work. "Gee, Brain, what do you want to do tonight?" "The same thing we do every night, Pinky - try to take over the world!" :) ;)


Morning.Star wrote 03/02/2017 at 18:43 point

Narf!

Cheers Dave :-)


markgeo wrote 03/02/2017 at 12:53 point

Wow. Your update shows how ambitious and awesome this project is. I will definitely be following along.

I had to do several print-cut-and-try cycles with slightly different scaling to get the pieces to fit. I used some cardboard shims under the servo mounting ears, which not only fixed the fit but added extra material for the screws to bite into, similar to what you did by doubling the bearers.

Do you plan to release your code when you are finished?


Morning.Star wrote 03/02/2017 at 18:42 point

Thank you, that's kind. I've put an awful lot of research into this over the years, and working with @Mark Nesselhaus has realised a lot of it. It would not have been possible without his input.

Hmmm. I would have to speak to Mark about that. While Cardware is intended to be open and flexible so anyone can use it, we're not just giving it away ;-)


markgeo wrote 03/02/2017 at 02:38 point

I really like this. I tried building it and found that the depth of the thigh is a bit shallow - the servos don't sit all the way down with the wires routed under them and a screw inserted below them for attachment to the yoke. It's a simple mod to increase the depth of the servo compartment.


Dr. Cockroach wrote 03/02/2017 at 08:02 point

Good morning, There will be some updates getting posted shortly as well as a surprise so check back later today :-)


Morning.Star wrote 03/02/2017 at 13:01 point

Hi Mark.

Thank you for your comments. I've added an update containing a properly scaled bitmap for those who wish to try this for themselves. There are also a couple of videos showing Cardware's nervous system in action. This is way more than just a model, it's living cardboard that can see, hear, feel and learn from you.


Andrew wrote 02/24/2017 at 01:49 point

Have you seen this? http://homofaciens.de/technics-machines-cnc-v3-0_en.htm

I have built several, and the thick card (1.2 or 1.5mm) with box construction is very strong.  You can see there are some pieces glued inside the hollow parts to strengthen it.  You could redesign your parts so that instead of folding, and instead of tabs, you have more, discrete, parts glued together.  Remember to compensate for the thickness of the card and take care to note whether parts are attached by their inside edge or outer face.


Morning.Star wrote 02/24/2017 at 09:26 point

Another incredible piece of work that only a handful of people seem to know about. I did a bit of googling to see if anyone else was up to this too, and found no-one. It seems they don't want to make a big deal of the fact that it's cardboard?

Thanks for the comments, always welcome. :-) This is still a proof-of-concept and there is a lot of work to do. One thing that everyone seems to have missed is that Cardware robots do not live forever, but their parts do. It's aimed at younger makers and modellers, not just geeks.

There is a lot more to the project than these simple polyhedrons; we're trying to make card and paper interactive on a level that hasn't been done before. This limb is a chassis for a sensor system as well as imagination; combined with #AIMOS it brings cardboard to life.


Morning.Star wrote 02/24/2017 at 09:47 point

I like your #Raspberry Pi Fermentation Controller BTW. Brown water / Python, chuckle...

I built a computer-controlled 'nanobrewery' called Brewnel a number of years ago to make beer from grain in a single tub. It contains an immersion element, filters and pumps, as well as a thermocouple and a weight gauge in the base to handle ingredient metrics. I should post it. The software was written in VB6, just before I met Debian Linux, and I don't have the source any more; it will take more time than I have spare to rework it, unfortunately, but it uses an Arduino to switch relays for the hardware.

:-)


ðeshipu wrote 02/23/2017 at 22:29 point

By the way, have you seen this project? http://zoobotics.de/project/zuri-01-3/


Morning.Star wrote 02/23/2017 at 23:08 point

No, I hadn't! These are all my designs based on conversations with a man who likes to think outside the box, same as me. Thanks for the heads-up, I'll have to send in the flying monkey-bots to deal with them. ;-)

Luckily Zoobotics manufacture their robots as kits. The entire point of Cardware is to remove the need for ordering physical parts, and the expensive machinery to produce them in the first place: die cutters, laser cutters and 3D printers are not needed, just an ordinary printer. Also, Cardware is designed to fold and glue, not assemble, and contains significant other advances over Zoobotics (although that is stunning work) in its interactivity.

Watch this space ;-)


ðeshipu wrote 02/23/2017 at 23:37 point

I'm not saying you stole the idea or that they do the same thing; I just thought that you could actually steal some ideas from them, because they work in a similar medium, so similar things will work. It might even be worth contacting them to discuss some of the challenges they faced!


Morning.Star wrote 02/24/2017 at 00:16 point

@Radomir Dopieralski I still like the flying monkey-bots plan myself lol. But you do have a point. Cheers for the interest...


ðeshipu wrote 02/18/2017 at 17:30 point

I spy a spider leg in there!


Morning.Star wrote 02/18/2017 at 18:25 point

Hey Radomir. Yes you do, this contains part of #AIMOS, which is what the first generation is based on. I'm also going to recreate AIME in cardboard, thanks to Mark "#The Cardboard Computer - IO is my name" Nesselhaus. Cheers for the follow and skull, I love your #Tote


ðeshipu wrote 02/18/2017 at 18:35 point

Now I know who Mark is. All the best with your project, we need more spider robots!


Duncan29 wrote 02/18/2017 at 15:12 point

Can't wait to see this in action :)


Morning.Star wrote 02/18/2017 at 15:51 point

Won't be too long, Dave; we're now into prototyping the basic shells. There will be a few, and I hope eventually people will make their own.

Thanks for the follow and skull :-)

