
Cardware

An educational system designed to bring AI and complex
robotics into the home and school on a budget.

Cardware is an educational system designed to bring AI and complex robotics into the home and school on a budget. It leverages cheap off-the-shelf electronics and kinematics - e.g. a Raspberry Pi Zero, hobby servos and a cheap webcam - and templated cardboard parts that fold into each of the pieces needed to join them together and make a moving chassis.

It is controlled by a distributed system containing a language that can incorporate its own feedback as well as instruct it, giving it a limited ability to learn. The system also handles all of the geometry necessary to articulate limbs and other parts from natural-language instructions by the user, handles speech and visual recognition, and features a rudimentary nervous system to differentiate touch across the surface of the cardboard.

All work contained in this project is subject to Creative Commons Licensing.

The Concept

Modern robotics hardware for the hobby market has been reduced to a series of modules, from servos to processors. The only things connecting these together are wire and chassis, and I've been experimenting with easy-to-obtain materials to build those chassis. Polycarbonate, polystyrene, polyethylene, sheet aluminium and the like were natural choices.

Mark's IO made me think again about the validity of cardboard as a working material, and we began a dialogue that culminated in this collaboration.

The idea is to use a single sheet of craft card and a printed template to create chassis units that connect the electromechanical building blocks together.

We have since iterated over a few changes to the design, incorporating better geometry to give the parts strength, room for cabling and better overall appearance.

The first generation limb mounted on a spare thigh section so I can test the servos...

Mark Nesselhaus has been able to replicate three of the pieces so far. If he can do it then anyone can. When using superglue, please watch your fingers.

Once we had a working prototype chassis, the next job was to get the nervous system up and running.

Mark is now waiting for his 'brain' and 'muscles' to arrive via the postage system (fingers crossed) so he can duplicate the next stage himself... This is the real meat of the project, and forms the interactive part of the cardboard. I'm working on internalising the touch panels for the 2nd generation shell.

The nervous system in action on a first gen limb, attached to my PC. I've managed to get it working as intended - it mimics the biological hallmark of withdrawing from a stimulus, and learning from the experience.

If you watch carefully, you can see that the servos are live and are holding the limb in position until I touch a surface, which it then withdraws from by a short distance. The system will record these motions and tag them to a hierarchical structure so you can 'program' the robot by touch alone. This only requires an MCU, but we are also using a RasPi with a camera and microphone to detect motion and shapes and respond to voice commands.
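
The hold-withdraw-record loop itself is conceptually tiny. Here's a rough Python sketch of the idea - not the actual AIMos code, and every object and method name in it is a placeholder:

WITHDRAW = 10                              # withdrawal distance in servo units (assumed)

def teach_loop(servos, panels, motions, gesture='step'):
  pose = servos.read_angles()              # hypothetical: current angle per joint
  touch = None
  while touch is None:                     # servos stay live, holding position
    servos.hold(pose)
    touch = panels.poll()                  # hypothetical: the touched panel, or None
  pose[touch.joint] -= WITHDRAW            # withdraw a short distance from the stimulus
  servos.move(pose)
  # tag the motion into a hierarchical structure so it can be replayed later
  motions.setdefault(gesture, []).append((touch.joint, pose[touch.joint]))
  return motions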

Testing the system on a Pi revealed the touch panels to be a lot more sensitive, and the cardboard literally came alive. I'm working on a way to integrate both these behaviours into the system.


Origaime - The next generation

The new parts build in exactly the same way as the first set: they fold up from a single piece of cardboard into a modular piece for the robot. Mark calls this Origaime, after the original robot that inspired everything and the art of folding. :-)

Both these pieces were accomplished using the redesigned saddle that fits over the servo actuator and a stud fitted opposite it to make a live hinge.

The section that joins these together to make active limbs is also made from one piece. This part carries the servos and the stud for the saddle to rotate over. Eventually there will be a range of part styles that can make any articulation you like, not just basic limbs.

This has been reworked so the servo bearers fold out of the chassis instead of being separate pieces. This better fits the Way of Origaime...

As I said, these parts are modular. Here's the other end being prototyped onto the shell at the body end. I hope hip replacements get this easy in the future.

And the shell itself of course. This features a customisable panel in the bottom which is used to provide ventilation for the electronics and also structural integrity. This part has the greatest amount of stress to deal with and needs to be solid. I chose a pattern I knew would do this, but then I realised this could be used to aesthetic effect, and I'm going over a few designs for graphics in here. Watch this space... ;-)

Here's a picture of what it looks like after a couple of strong beers. Nobody's perfect. ;-)

It still needs a lid; for now that is a simple octahedral dome with a port for the camera. I'm still working on this too.

All together...


aimos_core.ino

Arduino code to control servos synchronously from an asynchronous control structure

x-arduino - 19.04 kB - 03/23/2017 at 21:42


aimosdriver

Python code to interact with AIMos Core

aimosdriver - 6.60 kB - 03/23/2017 at 21:42


hackaday_theme.doc

MS Word format DOC containing replacement shell templates themed with a stylised HaD skull. Scaled for US Letter prints, will also print on A4 without modification.

application/msword - 699.50 kB - 03/28/2017 at 16:32


hackaday_theme.odt

OpenDocument format ODT containing replacement shell templates themed with a stylised HaD skull. Scaled for US Letter prints, will also print on A4 without modification.

application/vnd.oasis.opendocument.text - 698.27 kB - 03/28/2017 at 16:32


origaime.doc

MS Word format DOC containing the full prints for Cardware Origaime v3.2. Scaled for US Letter prints, will also print on A4 without modification.

application/msword - 2.87 MB - 03/28/2017 at 16:32



  • 1 × ATmega 1284P (or Arduino) to host the Core systems: nervous system, kinematics and digital sensors
  • 1 × Raspberry Pi Zero, A, B with camera etc (optional; or a Beagle etc) to host the AI systems: audio/visual interaction, learning, enhanced sensors
  • 1 × Software: AIMos Core and AIMos UX, AIMil language
  • 1 × Chassis hardware: downloadable chassis templates, your own choice of materials
  • 1 × Tools: No. 1 Pozidriv screwdriver, scalpel or craft knife, scissors, glue; optionally a PC for control and Core programming

  • I'm walkin here

    Morning.Star, 14 hours ago


    I've finally managed to puzzle out the gimballing for the limbs to the point where I can address them from either end of the chain.

    I discovered it wasn't possible to mirror the mesh internally to handle standing on the opposite foot, as I'd planned. It became a nightmare involving such tricks as defining the second mesh inside-out, with anticlockwise polygons, but the gimballing then was reversed too and I gave up; I can't get my head around that.

    Instead, I've defined the robot from the left foot which, when it isn't being stood on, calculates its position backwards from the other foot and then rotates the entire model to make that foot face downwards. Providing all the interstitial rotates are symmetrical, the feet remain parallel to the floor and the servo angles sum to 180 degrees.

    This is highly useful, because it applies with the limbs in a position to step up onto a platform as well as with both of them flat on the floor... Using this information, I've defined a series of poses for the limbs that I can interpolate between to get smooth motion. I've called these seeds, and all the robot's motions will be defined using them.
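
    The interpolation itself is just a per-joint blend. A minimal Python sketch of the idea, with made-up seed values rather than the real set:

    def blend(seed_a, seed_b, t):
      # interpolate every joint angle; t runs from 0.0 to 1.0
      return dict((j, seed_a[j] + (seed_b[j] - seed_a[j]) * t) for j in seed_a)

    stand = {'anklex': 128, 'ankley': 128, 'knee': 128, 'hipx': 128, 'hipz': 128}
    crouch = {'anklex': 100, 'ankley': 128, 'knee': 180, 'hipx': 100, 'hipz': 128}

    for step in range(11):                  # ten steps from one seed to the next
      pose = blend(stand, crouch, step / 10.0)
      # each intermediate pose gets sent to the servo controller here

    Chaining seeds end to end like this is what turns a handful of poses into smooth motion.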

    Here is a full seed set cycled around. I've tilted the display so the part rotates are clearer, but this isn't included in the world coordinate system yet and the position jumps. I'll be refining these, and adding Z rotates for the hips so it can turn. This adds yet another calculation for the far foot position in the mapping system and I'm not there yet.

    The next task is to add footprints to the world coordinates, which will enable mapping and platforms, but first I have to integrate the balancing routine and switch that on and off periodically. The seeds are intended to bring the system close to balance, so that I can also use inertial forces later in the development. This will be a matter of timing as well as servo positions; currently I'm working on mapping mode, where it has to determine if there is floor to step onto and thus balance on one foot.

  • Twas brillig, and the slithy Toves did gyre and gymble

    Morning.Star, 11/13/2017 at 09:48

    Point Cloud Kinematics

    There's actually a lot more to it than meets the eye as far as information is concerned, but it's embedded and bloody hard to get to, because it's several layers of integrals, each with their own meta-information. I've touched on Cloud Theory before, and used it to solve many problems including this one, but for a cloud to have structure requires a bit of extra work mathematically.

    Our universe, being atomic, relies on embedded information to give it form. Look at a piece of shiny metal: it's pretty flat and solid, but zoom in under a microscope and you see hills and valleys, great rifts in the surface, and it doesn't even look flat any more.

    Zoom in further with a scanning electron microscope and you begin to see order - regular patterns as the atoms themselves stack in polyhedral forms.

    If you could zoom in further you'd see very little, because the components of an atom are so small even a single photon can't bounce off them. In fact they are so small they only exist because they are there, and they are only 'there' because of boundaries formed by opposing forces creating an event horizon - a boundary that determines whether an electron, for example, is considered part of an atom or not. It's an orbital system much like the solar system; its size is governed by the mass within it, which is the sum of all the orbiting parts. That in turn governs where it can be in a structure, and the structure's material behaviour relies upon it as meta-information.

    To describe a material mathematically, you then have to also supply information about how it is built - much as an architect supplies meta-information to a builder by using a standard brick size. Without this information the building won't be to scale, not even the scale written on the plan. And yet that information does not appear on the plan; brick size is meta-information: information that describes information.

    A cloud is a special type of information. It contains no data. It IS data, as a unit, but it is formed solely of meta-information. Each particle in the cloud is only there because another refers to it, so a cloud either exists or it doesn't as an entity, and is only an entity when it contains information. It is self-referential, so all the elements refer only to other elements within the set, and it doesn't have a root like a tree of information does.

    A neural network is a good example of this type of information, as is a complete dictionary. Every word in the language has a meaning which is described by other words, each of which is also described. Reading a dictionary as a hypertext document can be done, however you'd visit words like 'to', 'and' and 'the' quite a few times before you were done with accessing every word in it at least once. You could draw this map of hops from word to word, and that drawing is the meta-map for the language, its syntax embedded in the list of words. Given wholemeal to a computer, it enables clever tricks like Siri, which isn't very intelligent even though it understands sentence construction and the meaning of the words within a phrase. There's more - context - which supplies information not even contained in the words. Structure...
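
    The self-referential property is easy to demonstrate with a toy dictionary in a few lines of Python (the entries are invented for illustration):

    toy = {                                 # every 'definition' uses only words in the set
      'cat':    ['animal', 'small'],
      'animal': ['living', 'small'],
      'small':  ['animal'],
      'living': ['animal', 'cat'],
    }

    def hops(word, seen=None):
      # walk the definition graph, visiting every reachable word once
      seen = seen or set()
      if word not in seen:
        seen.add(word)
        for w in toy[word]:
          hops(w, seen)
      return seen

    print(hops('cat'))                      # the whole cloud is reachable from any entry

    There is no root; pick any word and the walk reaches all the others, which is exactly what makes it a cloud rather than a tree.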

    This meta-information is why I've applied cloud theory to robotics, and so far it has covered language processing, visual recognition and now balance. Even though the maths to create it is complicated, cloud-based analysis of the surface of the robot is a lot simpler than the trigonometry required to calculate the physics as well.

    But it's not all obvious...

    I first tried to create a framework for the parts to hang off and immediately ran into trouble with gimballing. I figured it would be a simple task to assign a series of coordinates from which I could obtain angle and radius information, modify it, and then write it back to the framework coordinates.

    This works, and hangs the parts off correctly using the axes to offset each part....


  • Pixelium Illuminatus

    Morning.Star, 11/08/2017 at 09:39

    And other arcane mutterings.

    After meeting a dead-end in AIMos with the image recognition based on pixels, I realised I'd have to find a way to either make them triangular to use Euclidean math on them, or find a way to make Euclidean math work on polygons with 4 sides to match the Cartesian geometry of a photo. Digital images are pretty lazy, just a grid of dots with integer dimensions reduced to a list of colours and a width and height.

    It isn't immediately obvious, but that isn't how a computer handles a pixel on screen, because of scalable resolution. Once inside, it has 4 corners with coordinates 0,0 , 1,0 , 1,1 and 0,1 and happens to be square and the right size to fit under a single dot on the display. The display is designed for this, and modern monitors can even upscale fewer pixels to give a decent approximation of a lower resolution image.

    This interpolation, averaging of values, can also be used to reshape an image by getting rid of the pixels completely, which turned out to be the answer to the problem.
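
    For reference, the averaging a resampler does boils down to blending the four pixels around a fractional coordinate. A minimal bilinear sketch - not Cardware's actual routine:

    def bilinear(img, x, y):
      # sample a 2D list of grey values at a fractional coordinate
      # by blending the four surrounding pixels
      x0, y0 = int(x), int(y)
      fx, fy = x - x0, y - y0
      x1 = min(x0 + 1, len(img[0]) - 1)
      y1 = min(y0 + 1, len(img) - 1)
      top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
      bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
      return top * (1 - fy) + bot * fy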

    Cardware's internal rendering system hybridises Euclidean and Cartesian geometry to produce a bitmesh, which is a resolution-independent representation of a digital image. It can't improve the resolution of the image, so it works underneath it, using several polygons to represent one pixel and never less than one per pixel.

    This is achieved by using the original resolution to set the maximal size of the polygons, and then averaging the colours of the underlying pixels. Then whenever that polygon is reshaped, it maintains the detail contained in it, as well as the detail between it and its neighbours, independently of the screen grid. Taking the maximum length of the sides and using that as the numeric base for the averaging does this a lot faster than Fourier routines, even Fast Fourier routines, to abstract and resolve the pixel boundaries.
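
    As a rough sketch of that colour-averaging step - approximated here with the axis-aligned box of pixels under a polygon, where the real routine works on the polygon's own boundary:

    def polygon_colour(img, x0, y0, x1, y1):
      # average the RGB pixels in the box under a polygon:
      # one colour per polygon, and never finer than one polygon per pixel
      total, n = [0, 0, 0], 0
      for y in range(y0, y1 + 1):
        for x in range(x0, x1 + 1):
          r, g, b = img[y][x]
          total[0] += r; total[1] += g; total[2] += b
          n += 1
      return (total[0] // n, total[1] // n, total[2] // n)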

    Because the system now has an abstraction of the image, it can be played with so long as the rules of its abstraction are obeyed. Everything is in clockwise order from out to in, or left-to-right and down as per writing, and has logical boundaries in the data that obey Cartesian rules. This means I can use Pythagorean maths, but handled like Euclidean triangles with unpolarised angles that are simply relative to each other.

    Triangles are unavoidable, but I struck on the idea of making them 4-sided so they handle the same as squares and don't require special consideration. A zero-length side and a zero theta does not interfere with any of the maths I've used so far, and only caused one small problem with an old routine I imported from AIMos. That was easy to write a special case for, and isn't really part of the maths itself, but part of the display routines.

    Here's a stalled draw from the Python POC code showing the quality the system can achieve. I was expecting Castle Wolfenstein, but this is going to be more like Halo - near-photographic resolution, and fast too.

    The Python that draws this is the calculation routine with pixels obtained from the source map and re-plotted. Once the polymap has been deformed by the mesh and rotated into place those pixels will be polygons and the holes will disappear. The original was 800x600 and takes around 12 seconds to fully render. Once in C++ this will come down to a fraction of a second for a photo quality shell-render of the entire robot, maybe a few frames a second if I'm careful.

    Not in a Pi though, so compromises will have to be made...


    Yeah I know, walking, I'm not ignoring it.

    Actually this maths is all related directly to it as well as the recognition system and the perceptual model I'm trying to build...

    OK, so now I know where the actual centre of a triangle is without an awful lot of messing around. That's an equilateral and quite easy to calculate, but a right-angled triangle, two of which you find in a square, will give you...


  • Building Worlds

    Morning.Star, 10/30/2017 at 11:07

    ...And populating them

    * Now with code to handle multiple objects, save and load objects to a human-readable file and rotate the scene with the cursor keys. This is about as far as I am taking the Python code, as it has served its purpose: to design and assemble the models. There is a bit of tidying up needed - a routine to attach the limbs in a proper chain using the joint angles to display the entire model, and another to export the parts as Wavefront object files, which is pretty easy. Then Cardware can be 3D-printed, milled and rendered in almost any rendering package commercial or otherwise, be simulated in many engines, and be represented by a networked Cardware engine, optionally attached to the sensors on a real Cardware model...
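
    The OBJ export really is the easy part; the whole format for a polygon mesh is a few lines. A minimal sketch, assuming vertices as (x, y, z) tuples and faces as lists of 0-based vertex indices:

    def write_obj(path, vertices, faces):
      # dump a mesh as a Wavefront .obj file; OBJ face indices are 1-based
      with open(path, 'w') as f:
        f.write('# Cardware part export\n')
        for x, y, z in vertices:
          f.write('v %f %f %f\n' % (x, y, z))
        for face in faces:
          f.write('f %s\n' % ' '.join(str(i + 1) for i in face))

    # e.g. a single square polygon:
    write_obj('part.obj', [(0,0,0), (1,0,0), (1,1,0), (0,1,0)], [[0, 1, 2, 3]])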

    Most of this code will be rewritten in C++ so I can draw those polygons with my own routines and handle the depth on a per-pixel level. This also means I can wrap a photograph of the real one around it. Python is way too slow for this without a library, and PyGame's polygon routines would still be terrible if they understood 3D coordinates. ;-)

    I suppose I have to make that skins editor I toyed with while printing the plans now, to make that easy. Double-sided prints are possible but are a bit of a pain to line up properly, and I don't have cash for colour cartridges lying around, or at least some of my Metal Minister would have been photographic. Using a skins editor can make that simple, and it isn't just for pretty, although that is a consideration;

    The main reason is to map the robot's markings to its geometry so it is recognisable to another. Because they are a shared system, this information is available to all robots in the network so they can say this is where I am and this is what I look like, so the visual recognition system can verify it and link it to the world math. But it doesn't have to be a real robot; it can equally be a simulation, which the individual robot's 'mind' won't be aware of. Even the simulated ones...

    We beep, therefore we are. :-D

    To understand the world, and indeed itself, a robot needs to metricate everything and represent it mathematically. It is very important for movement that the robot also understands its perimeters and not just the angles of the joints, so it has situational awareness. So, the first thing I had to do was measure everything and create mathematical models for it to represent itself with.

    First the easy bits, simple to pull measurements directly off the plans in mm. I'm using that because it fits better into denary math and I can round to the nearest mm without affecting the shape. The card itself has a 0.5 mm thickness and the physical model can vary by that anyway.

    Turns out the model can deviate quite a lot from the plans even when they are printed and folded to make the actual model, and accuracy has little to do with it in practice on some parts. More on that later...

    The Thigh Module

    Hip and Ankle Modules are simply half of the above part, easy to generate a mesh for during import. The legendary Hip Unit (Units are unpowered, Modules are powered) was already measured to calculate the inner radius from the outer, a 2mm difference in diameter.

    The Hip Unit

    The foot is more complicated. Filled curved shapes are a nightmare to compute edges for, so I've broken them into a mesh. This was done manually in Inkscape from the original drawing.

    Overlaid with polygons and then measured - that's the foot done too.

    The Foot Module

    The lid was a little bit more complicated. While I can draw orthogonal plans to scale, I'm not entirely sure that's accurate to the mm in all three dimensions. The original MorningStar was not calculated to be the shape it was; I discovered the fold almost accidentally and then figured out the mathematics for it, then used them to compute the lid dimensions as curves. Interpolations from that math were...


  • More Power Igor

    Morning.Star, 10/21/2017 at 10:36

    Well, finally I was let down by a lack of amps; the batteries I have just aren't strong enough to move the limbs without at least doubling them up. And I'll need to upgrade the regulators. Such is life...

    I added a little bit of code to the Pi's monitor. The raw_input() function was going to be replaced by an open() and list() on a script containing mnemonics.

    from time import sleep  # 'port' (an open pyserial handle), 'done' and 'packet' come from the surrounding monitor code

    def injector():
      global port,done,packet
      
      print 'Starting input thread.'
      
      joints=['anklex','ankley','knee','hipx','hipz']
      cmds=['movs','movm','movf','movi']
      syms=['s','m','f','i']
      rngs=[15,10,5,1]
      while not done:
        cmd=-1
        spd=-1
        srv=-1
        agl=-1
        print '->',
        i=raw_input('')
        if i!='':
          inp=i.split(' ')
          if len(inp)==3:
            srv=0
            if inp[0] in cmds: cmd=cmds.index(inp[0])
            if inp[1] in joints: srv=joints.index(inp[1])+1
            try:
              agl=int(inp[2])
            except: agl=-1
            if cmd>-1:
              spd=rngs[syms.index(cmds[cmd][3])]
            if cmd>-1 and srv>-1 and agl>-1 and spd>-1:
              checksum=1+srv+agl+spd # sum of the five payload bytes: ctl (1), srv, agl, spd and a trailing 0
              chk2=(checksum & 65280)/256
              chk1=(checksum & 255)
              port.write(chr(chk1)+chr(chk2)+chr(1)+chr(srv)+chr(agl)+chr(spd)+chr(0))
              sleep(1)
        else: done=True

    This would allow me to program motions for the servos, if they actually moved. They hum, but they just don't have the juice.

    Closing existing port... Port not open
    Clearing buffers...
    Connected!
    Starting input thread.
    -> movm knee 255
    Servo: 10 10 128
    Move: 10 128
    Move: 9 140
    Move: 8 153
    Move: 7 166
    Move: 6 178
    Move: 5 191
    Move: 4 204
    Move: 3 216
    Move: 2 229
    Move: 1 242
    -> movm knee 128
    Servo: 10 10 255
    Move: 10 255
    Move: 9 242
    Move: 8 229
    Move: 7 216
    Move: 6 204
    Move: 5 191
    Move: 4 178
    Move: 3 166
    Move: 2 153
    Move: 1 140
    -> movm anklex 0
    Move: 10 128
    Move: 9 115
    Move: 8 102
    Move: 7 89
    Move: 6 76
    Move: 5 64
    Move: 4 51
    Move: 3 38
    Move: 2 25
    Move: 1 12
    -> movm anklex 128
    Move: 10 0
    Move: 9 12
    Move: 8 25
    Move: 7 38
    Move: 6 51
    Move: 5 64
    Move: 4 76
    Move: 3 89
    Move: 2 102
    Move: 1 115
    -> movm anklex 255
    Move: 10 128
    Move: 9 140
    Move: 8 153
    Move: 7 166
    Move: 6 178
    Move: 5 191
    Move: 4 204
    Move: 3 216
    Move: 2 229
    Move: 1 242
    -> movi anklex 128
    Move: 1 255
    -> movi anklex 0
    Move: 1 128
    -> movs anklex 0
    Move: 15 0

    This is the above code interacting with a live processor controlling the servos. For now I have defined just the following instructions.

    • MOVI - Move Instant; sets the servo angle to the destination in one step.
    • MOVS, MOVM and MOVF - Move Slow, Medium, Fast; sets the servo to the position specified interpolating by decreasing numbers of steps.
    • The servos are specified by name - AnkleX, AnkleY, Knee, HipX and HipZ - and the angle is specified directly as an 8-bit number between 0 and 255.

    These execute over time inside the Atmel without further instruction, freeing up the ports and main processor for the task of observing the sensors and interjecting to compensate. It isn't ideal, but it is a lot faster than trying to directly control the servos over those serial links.
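
    Judging by the log above, the in-chip stepping works out to something like this - a Python sketch inferred from the output, not the actual Core source:

    def plan(last, target, steps):
      # yield the intermediate angles a move command steps through;
      # MOVS/MOVM/MOVF/MOVI differ only in step count (15/10/5/1)
      for step in range(steps, 0, -1):      # counts down, as in the log
        frac = (steps - step) / float(steps)
        yield step, int(last + (target - last) * frac)

    for step, angle in plan(128, 255, 10):  # movm knee 255 from the mid position
      print('Move: %d %d' % (step, angle))

    which reproduces the Move lines shown above.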

    I'm going to take a little break from this for a while. Or I will begin to dislike it... And I need to paint, and play my guitar some to chill out.

  • Broken but not bowed

    Morning.Star, 10/20/2017 at 13:14

    OK, so I've had a good cuss at those rotten processors. 'Well, I think MOST of them worked when we packed them...' just isn't good enough when money changes hands, so I'm done with them.

    Maybe I'll come back to them later along with the multiprocessor networking, when I've had a break...

    Meantime, fuelled by panic and a weak but nonetheless calming solution of Ethyl, I rewired the processor board after snapping off and junking the ESP partition. I had to do this; I could not get either of the two ATMega 1284P-PU chips to program. One's in a Sanguino with Sprinter installed on it - I did that myself, so I know it's good. Except it isn't. I wasted an hour or two on it, and that followed the ESP onto the pile of failed crap I paid money for. Four Duinos and two ESPs, a handful of voltage dividers and a few chips just lately.

    Oh well, trusty ATMega 328P-PUs it is then. Each has its own UART, and the control code on the Pi talks to the processor via the USB ports, leaving one spare for the camera and one for a WiFi dongle. There is the option of putting a precompiled solution on the ESP and interfacing to it via the Pi's 40-way connector using #pi2wifi .

    That's now operational and contains a working servo sequencer.

    I have managed to rewrite the multicore_ini code to interface directly with the PC and got it working.

    #include <Servo.h> 
    const int host=1;                                 // this processor
    const int servos=5;
    // servo definition structure:
    // articulation
    //      |____ servo (servo object)
    //      |          |____ attach()
    //      |          |____ write()
    //      |____ pin (physical pin number)
    //      |____ min (minimum range of movement 0-255)
    //      |____ max (maximum range of movement 0-255)
    //      |____ home (home position defaults to 128; 90 degrees)
    //      |____ position (positional information)
    //                 |____ next (endpoint of movement)
    //                 |____ pos (current position as float)
    //                 |____ last (beginpoint of movement)
    //                 |____ steps (resolution of movement)
    //                 |____ step (pointer into movement range)
    //
    // packet configuration:
    // bytes 1-2: header - 16-bit checksum, low byte first
    // byte 3: control - 1 byte packet type
    // byte 4: parameters - 1 byte meta
    // byte 5: data 1 - arbitrarily assigned
    // byte 6: data 2 - arbitrarily assigned
    // byte 7: data 3 - arbitrarily assigned
    struct servo_position {                           // servo status
      int next;
      float pos;
      int last;
      int steps;
      int step;
    } ;
    typedef struct servo_position servo_pos;          // atmel c++ curiosity, substructs need a hard reference
    struct servo_definition {                         // servo information
      Servo servo;
      int pin;
      int min;
      int max;
      int home;
      servo_pos position;
    } ;
    typedef struct servo_definition servo_def;        // servo structure containing all relevant servo information
    servo_def articulation[servos];                   // array of servo structures describing the limb attached to it
    int mins[]={ 0,0,0,0,0,0,0,0,0,0,0,0 };           // defaults for the servo ranges and positions
    int maxs[]={ 255,255,255,255,255,0,0,0,0,0,0,0 };
    int homes[]={ 128,128,128,128,128,0,0,0,0,0,0,0 };
    unsigned int check,checksum;                      // sums of several bytes can exceed 255
    unsigned char chk1,chk2,ctl,prm,b1,b2,b3;
    void setup() {
      Serial.begin(115200);
      while (!Serial) { ; }                           // wait for the port to be available
      for (int s=0; s<servos; s++) {                  // iterate servos
        articulation[s].servo.attach(s+2);            // configure pin as servo
        articulation[s].pin=s+2;                      // echo this in the structure
        articulation[s].home=homes[s];                // configure the structure from defaults
        articulation[s].min=mins[s];
        articulation[s].max=maxs[s];
        articulation[s].position.next=homes[s];
        articulation[s].position.pos=homes[s];
        articulation[s].position.last=homes[s];
        articulation[s].position.steps=0;
      }
      
      for (int d=0; d<1000; d++) {                    // garbage clear
        if (Serial.available() > 0) { unsigned char dummy=Serial.read(); }
        delay(1);
      }
    } 
    void loop() { 
      if (Serial.available() >= 7) {                  // if there is a packet
        chk1=Serial.read();                          // read the packet
        chk2=Serial.read();
        ctl=Serial.read();
        prm=Serial.read();
        b1=Serial.read();
        b2=Serial.read();
        b3=Serial.read();
        checksum=chk1+(chk2*256);
        check=ctl+prm+b1+b2+b3;
        if (checksum!=check)...

  • Wish I'd seen this coming

    Morning.Star, 10/19/2017 at 07:34

    F*ing ESP8266, I've had my doubts about it right from the start.

    I built this to control the servos...

    Using the multiprocessor serial networking. The TX of each is on a flylead that plugs into the RX of the next, and it's broken out on the top left corner so I can connect the TX and RX of the UART.
    It's powered by a 5V 2A supply, and the whole thing is paranoically well grounded and shielded to prevent any crosstalk.

    Here the ESP is connected directly to the UART and has been programmed with multicore.ino:
    const int host=1;
    const int packlen=1;
    unsigned char bytes;
    int bootdelay;
    void setup() {
      Serial.begin(115200);                           // open serial port
      while (!Serial) { ; }                           // wait for the port to be available
      if (host==1) { bootdelay=1000; } else { bootdelay=5000; }
      for (int d=0; d<bootdelay; d++) {               // garbage collect delay
        if (Serial.available() > 0) { unsigned char dummy=Serial.read(); }
        delay(1);
      }
      
    } 
    void loop() { 
      int val1;
      if (Serial.available() > 0) {                   // if there is a byte waiting
        bytes=Serial.read();                          // read it
        val1=(int)bytes;
        Serial.write(bytes);                          // and echo it straight back
      }
    }

    Which has had everything stripped out so all it does is return what it is sent. It can be uploaded to the ESP and the Atmel without modification and should perform the same. On an Atmel, viewed from a simple port scanner and byte injector script:

    Port not open
    Clearing buffers...
    Connected!
    Starting input thread.
    Send command byte
    Byte : 1
    00000001 - 01 - 1  
    Send command byte
    Byte : 2
    00000010 - 02 - 2  
    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    Send command byte
    Byte : 255
    11111111 - ff - 255 - �
    Send command byte
    Byte : 
    Thread done!
    Main done!
    

    However, uploaded to an ESP (and I have now tried two, one from eBay and one that Ars sent me) it does this instead.

    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    00000000 - 00 - 0  
    00000000 - 00 - 0  
    11111110 - fe - 254 - �
    11111110 - fe - 254 - �
    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    Send command byte
    Byte : 3
    00000011 - 03 - 3  
    Send command byte
    Byte : 4
    10000010 - 82 - 130 - �
    Send command byte
    Byte : 1
    11000000 - c0 - 192 - 
    Send command byte
    Byte : 1
    11000000 - c0 - 192 - 
    Send command byte
    Byte : 2
    00000010 - 02 - 2  
    11111111 - ff - 255 - �
    00000000 - 00 - 0  
    00000000 - 00 - 0  
    11111111 - ff - 255 - �
    Send command byte
    Byte : 00000000 - 00 - 0  
    00000000 - 00 - 0  
    11111100 - fc - 252 - 
    11111110 - fe - 254 - �
    11111110 - fe - 254 - �
    11110000 - f0 - 240 - 
    11111100 - fc - 252 - 
    11111111 - ff - 255 - �
    11110000 - f0 - 240 - 
    11111110 - fe - 254 - �
    11111010 - fa - 250 - 
    11111111 - ff - 255 - �
    11111111 - ff - 255 - �
    00000000 - 00 - 0  
    11100000 - e0 - 224 - 
    10000000 - 80 - 128 - �
    11000000 - c0 - 192 - 
    11111000 - f8 - 248 - 

    Of particular note is the last section. While I was sitting and puzzling over where all the spare bytes were coming from, a few more appeared at random. Then a few more, and more as I watched. Eventually it filled up the terminal over about 10 minutes in random bursts.

    The Atmels don't do it using the same code on the same board, same power supply and connected to the same rig, so I really have no explanation other than I have two bad processors and I've wasted my time trying to get something broken to work. It's making up data on its own; no wonder I couldn't get anything meaningful out of perfectly good code - which I wrote blind, I might add. It runs on the Atmels without modification, but I had to add garbage collection and checksumming just to get it to run with the ESP, and now I know why.

    I've built two motherboards, one using brand-new hardware, hacked a test rig with flyleads and Arduinos and an ESP programmer board I built, and nothing worked. Finally I strip it all back and watch it taking the piss...

  • Make that a double Absinthe and a Hemlock chaser

    Morning.Star, 10/17/2017 at 05:35

    Massive problems with that as a concept.

    Well the code works in theory, and testing the individual units together works, but I cannot get the ESP to sync with the Atmels properly. I think it may be to do with the mismatch in processor speeds, but I cannot get all three and the PC to sync up and send meaningful data to each other.

    The pccore python code is fairly simple: all it does is accept and insert bytes from the terminal in a thread of its own, and pass on packets not addressed to itself. It deletes any for itself, and displays what is on the network. The PC has a USART with its RX and TX inserted into the chain after the ESP to make a square with a device on each corner - eventually this would be a RasPi or something - and the ESP would inject packets from the WiFi, which the Pi would be able to service.

    As you can see, sending a packet from a processor to itself works with another processor connected to it;

    That packet says from processor 4 (PC) to processor 4, sensor data, 255 255 255, and always works with at most one processor plugged into the chain. The processor reads the header and bounces it, and the PC deletes it when it comes back so it doesn't loop. In theory, a chain of hundreds of processors is possible. However, add another to the chain and it usually works; add a third, particularly the ESP, and it fails with garbage returned, and I cannot figure out why. It isn't my code, and I can't tell if it's Arduino's or a basic hardware issue, so I'm going to have to abandon the idea.

    Losing my processor solution this late in the game is a killer. I don't know what to do now other than directly hook 10 servos to one Atmel and drive them over a single RX TX serial link. I know from experience this will be rough and jerky, and probably fall over, but I have to try, right?

  • Ordering a double Latte in C++

    Morning.Star, 10/12/2017 at 08:34

    OK, so now I'm wandering around muttering code. As you may be aware, I design out of my head and document the results after building, and code is no exception. Somehow I've managed to build a multiprocessor semaphore that ties two identical pieces of code to a third very similar one, all three running transparently and interacting with each other, and it's puddled my brain, lol. I don't even drink coffee. ;-)

    The processor board has two ATmega 328s and one ESP8266, and they are networked together by joining the TX pin on each to the RX pin on the next to make a triangle with a processor on each corner.

    Obviously they will need to pass data internally from RX to TX, and this would loop forever without some kind of semaphore so I have defined a protocol.

    The data is sent using a 5-byte packet.

    Byte 1 contains the header - two 4-bit numbers that specify where the packet came from and where it's going to. They are bit-wise, so that a processor may address both the others at once by combining their ID bits.

    Byte 2 contains a command code and parameters - 3 bits tell the system that the packet is a servo instruction or a request for sensor data, or sensor data itself. 4 more bits optionally give the servo number if it is a servo instruction, and bit 8 is always spare for now.

    Bytes 3 - 5 contain data in an arbitrary format. For a servo instruction, data byte 1 gives the angle and data byte 2 gives the number of steps to take to move to it from the current angle; data byte 3 is blank. For a data request all three are blank, and for a sensor data packet the three bytes contain the three angles from the sensors.
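
    Packing and routing can be sketched in a few lines of Python. The nibble layout (source high, destination low) is my assumption from the notation; the forwarding rule is the one described below:

    def make_packet(src, dst, ptype, param, d1=0, d2=0, d3=0):
      # IDs are single bits (1, 2, 4...) so destinations can be combined
      header = (src << 4) | dst
      control = (ptype & 0b111) | ((param & 0b1111) << 3)
      return bytearray([header, control, d1, d2, d3])

    def route(packet, my_id):
      # returns (process_here, packet_to_forward)
      src, dst = packet[0] >> 4, packet[0] & 0x0F
      if src == my_id:
        return False, None                  # our own packet made a full circuit: delete it
      mine = bool(dst & my_id)
      dst = dst & ~my_id & 0x0F             # strip our own ID before passing it on
      if dst:
        packet[0] = (src << 4) | dst
        return mine, packet
      return mine, None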

    The Atmega 328 code

    Each processor reads its RX and looks at the header. If the source ID is the same as its own, the packet has made a circuit and is deleted. If it is for another processor it is passed to TX, after checking to see if it is also for this processor. If it is, its ID is removed from the header and the packet passed to TX. Any data for the processor is processed and sent to servos, or responded to with a sensor packet.

    //#include "EEPROM.h"
    #include <Servo.h> 
    const int host=4;                                 // this processor
    const int servos=5;
    // servo definition structure:
    // articulation
    //      |____ servo (servo object)
    //      |          |____ attach()
    //      |          |____ write()
    //      |____ pin (physical pin number)
    //      |____ min (minimum range of movement 0-255)
    //      |____ max (maximum range of movement 0-255)
    //      |____ home (home position defaults to 128; 90 degrees)
    //      |____ position (positional information)
    //                 |____ next (endpoint of movement)
    //                 |____ pos (current position as float)
    //                 |____ last (beginpoint of movement)
    //                 |____ steps (resolution of movement)
    //                 |____ step (pointer into movement range)
    //
    // packet configuration:
    // byte 1: header - 2 nybbles containing source id and destination id
    // byte 2: control - bits 1-3 packet type, bits 4-7 command parameters, bit 8 spare
    // byte 3: data 1 - arbitrarily assigned
    // byte 4: data 2 - arbitrarily assigned
    // byte 5: data 3 - arbitrarily assigned
    // eg 
    //        byte 1: 33/1000,0100 packet from processor 1 to processor 2 (esp to atmel a)
    //        byte 2: 9/100,1000,0 position command for servo 1
    //        byte 3: data 1 - angle (0-255 at the moment, will be percentage of range)
    //        byte 4: data 2 - steps (divisions in angular displacement)
    //        byte 5: data 3 - spare
    // eg
    //        byte 1: 33/1000,0100 packet from processor 1 to processor 2 (esp to atmel a)
    //        byte 2: 10/010,1000,0 request for sensor data
    //        byte 3: data 1 - spare
    //        byte 4: data 2 - spare
    //        byte 5: data 3 - spare
    // eg
    //        byte 1: 20/0010,1000 packet from processor 3 to processor 1 (atmel b to esp)
    //        byte 2: 12/001,0000,0 sensor data
    //        byte 3: data 1 - sensor 1
    //        byte 4: data 2 - sensor 2
    //        byte 5: data 3 - sensor 3
    struct servo_position {                           // servo status
      int next;
      float pos;
      int last;
      int steps;
      int step;
    } ;
    typedef struct servo_position servo_pos;          // atmel c++ curiosity, substructs need a hard reference
    struct servo_definition {                         // servo information
      Servo servo;
      int pin;
      int...

  • Tacking like a Galleon

    Morning.Star, 10/12/2017 at 07:52

    This skips about a bit, I'm afraid; I've got to the point where the hardware is just about complete and I'm tidying up the last major solutions before I get into heavy coding, although that has begun...

    I'm also working with Ars on his ZeroPhone documentation, so between code, electronics and artwork the rigging is beginning to creak. ;-)

    First up, power train.

    I've taken a large flat round magnet and stuck two cardboard discs onto the faces. Then it's liberally covered with copper tape to give it a fully conductive skin. This comprises a prototype battery mount, made by rolling paper round the cells and magnet after applying a little glue. The battery is solid and makes a decent contact, but pulls apart easily.


    Following the theme of magnetics, the battery snaps are also self-setting. These are made with a spade terminal stuck to a magnet and then layers of copper soldered to the crimp (quickly... neodymiums denature and lose their magnetisation if you get them really hot...) before enclosing it all with a boot made of shrink tube with a hole cut in the edge.

    They just adhere to the battery and make a perfect contact. I'll make caps to fit the ends better eventually.


    The servos need 6 V and there are ten of them; that will probably cook one regulator, so each leg has its own. I didn't have a spare 5 V regulator for the brain, so the separate regulator with the large boot on it is an LM317T with a resistor divider on a bit of board. They are all rated 1.5 A, so they should be enough.
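
    For reference, the LM317's output is set by that divider as Vout = 1.25 * (1 + R2/R1). A quick sanity check in Python, with illustrative resistor values rather than the pair actually on the board:

    def lm317_vout(r1, r2):
      # R1 sits between Vout and ADJ, R2 between ADJ and ground
      return 1.25 * (1 + r2 / float(r1))

    print(lm317_vout(240, 720))             # 5.0 V - a brain-rail candidate
    print(lm317_vout(240, 910))             # ~6.0 V - a servo-rail candidate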


    The hips

    Plugging the tops of the hip bearing tubes proved easier than expected. I rolled up a piece of card really tight and stuffed it in the end of the tube until it jammed. Then I cut it off with a pair of sharp snips, nibbling round it until it went through. This left a furry mess, but a trim and 5 or 6 drips of superglue later it's solid plastic, as the cyano wicks into the card and bonds it all solid. Be careful with thin cyano - this will get hot...

    It doesn't require drilling; just start a screw with a sharp point and it winds in there like a champ.


    The tops of the bearing tubes are faced with a foiled surface; I cut out the tube tops with a scalpel, as this is all on-the-fly and not designed in the computer and printed off. Note the servo box - this took a bit of lateral thought, as the original flapped around, being an open-topped box. The lid braces the servo as well as stabilising the walls.


    They have simple levers to drive the tops of the hip bearings. Ewww... Flapping canvas... That mechanism is older than I am, and both the servo and tube rotate in the same plane. Why are servos still boxes with levers? Bah!!! I can't make those tiny gear trains or I'd have hacked them a *long* time ago. ;-)


    It works, however overly complicated it may be.

    Both the lever parts are made by folding card into a zig-zag with 4 layers and soaking it with superglue. They may as well be plastic; they behave like polycarbonate - flexible to a point but extremely stiff and strong. Also greaseproof, like plastic is and card isn't...

    I'm going to use this in all the servo mounts across the system as it solves the problem of screwing those down securely.


    The servo box has been redesigned to have several layers for strength, and mounts securely in the right place, lining up with the bottom of the chassis even though it's just glued into place. Simply supergluing the servo tabs together is enough to make them rigid; they also have three layers and a vertical brace.


    Next up, coding the OS and getting some movement out of it...


  • 1
    Step 1

    First print out the full set of parts on 200-220 gsm card stock. If you are using a dark coloured card, print the templates onto 85 gsm photocopy paper and glue it to the card.

    Score all of the dotted lines well with a sharp point and a ruler. A ball-point pen is perfect for this as it also gives you a visual reference. (Thanks @K.C. Lee) Be careful not to score too deeply and damage the card so it tears. You just need it to bend in the right place.

    Cut out each of the parts and bend them into the basic form. It should be obvious where the tabs glue together by test-folding the pattern into shape.

    Wiring the antennae is fiddly but simple. Join them all together and insert them into the part. Be careful to label or otherwise mark which wire is which, or this will have to be done by trial and error in the software.

  • 2
    Step 2

    The modules are joined together with one screw through the saddle on one side that goes through a plastic bearer to stop the card wearing through. The other side of the saddle secures to the actuator arm on the servo with two screws.

  • 3
    Step 3

    Saddles are complicated but easy to do when you know how. It's just a matter of following the sequence of gluing, because some of those joints are multi-layer and internal. It is logical, however; if it doesn't fit, you are trying to fit it the wrong way. I can't stress enough the benefits of test-folding the part to see how it fits before gluing.


Discussions

Morning.Star wrote 03/16/2017 at 08:02

Cardware officially welcomes another collaborator.

Irony begs me to point out that I now have two cardboard Generals, Mark One and Mark Two.

XD

markgeo wrote 03/15/2017 at 06:54

Your latest update is even more awesome. I'm a version behind on my building due to not having enough servos. Currently waiting on more servos to arrive in the mail. Looks like I better place another order because I definitely want to try the Origaime version.

Morning.Star wrote 03/15/2017 at 16:17

Thanks Mark. Lol, I have the same problem. I so want one of these running around already!!

I've posted the latest generation files and instructions to give you a good head start. Mark Nesselhaus has fed back a few things from his build that I have incorporated into them. You'll also need an ATMega1284P or larger, say a 2560 based Arduino and the toolchain to make the sensory bus work.

* Spoiler alert * We are now into the stereoscopic vision system.

We are looking for collaborators to test our designs, you'd be welcome to a project log of your own describing your build. ;-) Welcome to the Way of Origaime!

Dave wrote 03/02/2017 at 17:03

Absolutely brilliant mate, nice work. 'Gee, Brain, what do you want to do tonight?' 'The same thing we do every night, Pinky - try to take over the world!' :) ;)

Morning.Star wrote 03/02/2017 at 18:43

Narf!

Cheers Dave :-)

markgeo wrote 03/02/2017 at 12:53

Wow. Your update shows how ambitious and awesome this project is. I will definitely be following along.

I had to do several print-cut-and-try cycles with slightly different scaling to get pieces to fit. I used some cardboard shims under the servo mounting ears that not only fixed the fit but added additional material for the screws to bite, similar to what you did by doubling the bearers.

Do you plan to release your code when you are finished?

Morning.Star wrote 03/02/2017 at 18:42

Thank you, that's kind. I've put an awful lot of research into this over the years, and working with @Mark Nesselhaus has realised a lot of it. It would not have been possible without his input.

Hmmm. I would have to speak to Mark about that. While Cardware is intended to be open and flexible so anyone can use it, we're not just giving it away ;-)

markgeo wrote 03/02/2017 at 02:38

I really like this. I tried building it and found that the depth of the thigh is a bit shallow - the servos don't sit all the way down with the wires routed under them and a screw inserted below them for attachment to the yoke. It's a simple mod to increase the depth of the servo compartment.

Dr. Cockroach wrote 03/02/2017 at 08:02

Good morning, There will be some updates getting posted shortly as well as a surprise so check back later today :-)

Morning.Star wrote 03/02/2017 at 13:01

Hi Mark.

Thank you for your comments. I've added an update containing a properly scaled bitmap for those who wish to try this for themselves. There are also a couple of videos showing Cardware's nervous system in action. This is way more than just a model, it's living cardboard that can see, hear, feel and learn from you.

Andrew wrote 02/24/2017 at 01:49

Have you seen this? http://homofaciens.de/technics-machines-cnc-v3-0_en.htm

I have built several, and the thick card (1.2 or 1.5mm) with box construction is very strong.  You can see there are some pieces glued inside the hollow parts to strengthen it.  You could redesign your parts so that instead of folding, and instead of tabs, you have more, discrete, parts glued together.  Remember to compensate for the thickness of the card and take care to note whether parts are attached by their inside edge or outer face.

Morning.Star wrote 02/24/2017 at 09:26

Another incredible piece of work that only a handful seem to know about. I did a bit of googling to see if anyone else was up to this too, and found no-one. Seems they don't want to make a big deal of the fact that it's cardboard??

Thanks for the comments, always welcome. :-) This is still a proof-of-concept and there is a lot of work to do. One thing that everyone seems to have missed is that Cardware robots do not live forever, but their parts do. It's aimed at younger makers and modellers, not just geeks.

There is a lot more to the project than these simple polyhedrons; we're trying to make card and paper interactive on a level that hasn't been done before. This limb is a chassis for a sensor system as well as imagination; combined with #AIMOS it brings cardboard to life.

Morning.Star wrote 02/24/2017 at 09:47

I like your #Raspberry Pi Fermentation Controller BTW. Brown water / Python, chuckle...

I built a computer-controlled 'nanobrewery' called Brewnel a number of years ago to make beer from grain in a single tub. It contains an immersion element, filters and pumps, as well as a thermocouple and a weight gauge in the base to handle ingredient metrics. I should post this. The software was written in VB6, just before I met Debian Linux, and I don't have the source any more. It will take more time than I have spare to rework it unfortunately, but it uses Arduino to switch relays for the hardware.

:-)

Radomir Dopieralski wrote 02/23/2017 at 22:29

By the way, have you seen this project? http://zoobotics.de/project/zuri-01-3/

Morning.Star wrote 02/23/2017 at 23:08

No, I hadn't! These are all my designs based on conversations with a man who likes to think outside the box, same as me. Thanks for the heads-up, I'll have to send in the flying monkey-bots to deal with them. ;-)

Luckily Zoobotics manufacture their robots as kits. The entire point of Cardware is to remove the need for ordering physical parts, and the expensive machinery to produce them in the first place. Things like die cutters, laser cutters and 3D printers are not needed, just an ordinary printer. Also, Cardware is designed to fold and glue, not assemble, and contains significant other advances over Zoobotics (although that is stunning work) in its interactivity.

Watch this space ;-)

Radomir Dopieralski wrote 02/23/2017 at 23:37

I'm not saying you stole the idea or that they do the same thing; I just thought that you could actually steal some ideas from them, because they deal in a similar medium, so similar things will work. It might even be worth contacting them to discuss some of the challenges they had!

Morning.Star wrote 02/24/2017 at 00:16

@Radomir Dopieralski I still like the flying monkey-bots plan myself lol. But you do have a point. Cheers for the interest...

Radomir Dopieralski wrote 02/18/2017 at 17:30

I spy a spider leg in there!

Morning.Star wrote 02/18/2017 at 18:25

Hey Radomir. Yes you do; this contains part of #AIMOS, which is what the first generation is based on. I'm also going to recreate AIME in cardboard, thanks to Mark "#The Cardboard Computer - IO is my name" Nesselhaus. Cheers for the follow and skull, I love your #Tote

Radomir Dopieralski wrote 02/18/2017 at 18:35

Now I know who Mark is. All the best with your project, we need more spider robots!

Dave wrote 02/18/2017 at 15:12

Can't wait to see this in action :)

Morning.Star wrote 02/18/2017 at 15:51

Won't be too long, Dave; we're now into prototyping the basic shells. There will be a few, and I hope eventually people will make their own.

Thanks for the follow and skull :-)
