
Project Log 3: The brain must Press X to jump.

A project log for DIY Human-sized Mech

Mechs are not viable, nor cheap, so I will try to design and build one alone anyway.

FulanoDetail 11/29/2022 at 14:38 (4 Comments)
Tuesday 10:53, 29/11/2022 - I'm still procrastinating

Another thing that a lot of people (including me) don't think much about when it comes to humanoid mechs is how the squishy hummies actually control them.

Your first thought is maybe to control them just like you would in a video game: use one joystick to look around, use the other joystick (or the arrow keys) to move around, and use other buttons for actions such as jumping, opening doors, etc.

The problem is that these actions only work in video games because you don't have to take into account irregularities in floor elevation, speed, balance control, the force applied to each limb, and a myriad of other things.

Inside a game you're a floating camera and the animations you see are just visual facades to trick you into thinking you're actually opening a door, kicking a wall and so on.

In order to make a robot move around on its own with simple controls such as the arrow keys (or the W, A, S, D keys), you would need to fit it with some kind of sensor to calculate depth and track the position of its body in real time, plus other types of sensors for other requirements.

One method is through the use of LIDAR (like RADAR, but using laser light instead of radio waves) and encoders.

Self-driving cars use these sensors, and they are really, really, really expensive, and their data is even more expensive to process in real time.

Yes, maybe you could use cheaper alternatives, like buying a bunch of cheap laser rangefinders/telemeters online and giving a computer the task of figuring out how to read all that information in a useful way, so the robot can somehow figure out how to walk.

Anyway, all of this is just to say that it is really, really complicated to make these a*holes walk.

Or maybe not: these guys were able to do it with a cheap Arduino and gyroscopic sensors (I think).



And don't even get me started on mind controlling electronic stuff.

Yes, there are sensors that can read signals from some parts of your brain, but these signals are super mega simple and can only trigger simple tasks, like turning something on or off.

And there are also muscle sensors that can detect the contraction of your muscles and send orders to robotic prosthetics, but I doubt it is possible to make complex motions with them.

These really cool videos show more or less how to make one of these sensors work, but even then, you can see that sometimes the guy lifts his entire arm and the sensor doesn't activate.
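As far as I understand it, the usual way these muscle sensors get turned into an on/off signal is to rectify the raw EMG voltage, smooth it into an envelope, and compare it against a threshold, which is also why a weak or slow contraction can fail to trigger anything. A minimal sketch in Python, where the sample values, window size and threshold are all made up for illustration:

```python
def emg_envelope(samples, window=4):
    """Rectify the raw EMG samples and smooth them with a trailing
    moving average, giving a rough muscle-activity envelope."""
    envelope = []
    for i in range(len(samples)):
        recent = samples[max(0, i - window + 1):i + 1]
        envelope.append(sum(abs(v) for v in recent) / len(recent))
    return envelope

def muscle_active(samples, threshold=0.3, window=4):
    """True if the smoothed activity ever crosses the threshold."""
    return any(level > threshold for level in emg_envelope(samples, window))

# Simulated readings: quiet, a burst of oscillation, then quiet again.
burst = [0.0, 0.0, 0.8, -0.9, 0.7, -0.8, 0.0, 0.0]
print(muscle_active(burst))        # the burst crosses the threshold
print(muscle_active([0.05] * 8))   # resting noise stays below it
```

A contraction that only nudges the envelope to just under the threshold would be missed entirely, which matches what happens in the videos.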


I thought of an alternative, and it will kinda make a crossover with another project of mine that I'm definitely not procrastinating on and avoiding.

It is a full-body VR haptic suit. The idea is to use the cheap piezoelectric buzzers that you can buy online to make a VR suit that allows you to feel and move in the virtual world.

Piezoelectric buzzers are really useful and versatile: you can use them as a sensor, as vibration feedback, and even as an electric generator.
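For the sensor role, the simplest scheme I can think of is to read the voltage spike a piezo disc produces when it is pressed or flexed and compare it against a threshold, ignoring the ringing right after a hit. A rough sketch in Python (the sample values, threshold and refractory window are my own made-up illustration, not measured piezo data):

```python
def detect_presses(samples, threshold=0.5, refractory=5):
    """Return the indices where a piezo voltage spike crosses the
    threshold, skipping `refractory` samples after each hit so the
    disc's ringing isn't counted as extra presses."""
    presses = []
    cooldown = 0
    for i, v in enumerate(samples):
        if cooldown > 0:
            cooldown -= 1
        elif abs(v) > threshold:
            presses.append(i)
            cooldown = refractory
    return presses

# Simulated readings: one spike at index 3 and another at index 20.
readings = [0.0] * 3 + [1.0] + [0.0] * 16 + [0.9] + [0.0] * 5
print(detect_presses(readings))  # [3, 20]
```

On a real microcontroller the same loop would run over ADC readings instead of a Python list, but the logic stays this simple.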

Well, in this empty head of mine, it seems like something quite feasible to do; I just don't know anything about programming and other such stuff.

However, if one can make a virtual avatar move in a virtual world with this haptic suit, maybe you could control a human-sized mech with it.

And that would also kinda cheapen up the controls and "brains" of the mechsuit a little. After all, you would be the one controlling the body; the computer wouldn't need to figure out how to walk, recognise floor elevation and all that stuff, simply because you would be doing all of that.
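In that scheme the mech's computer is reduced to a dumb relay: read a joint angle from the suit, clamp it to what the mech's joint can safely do, and convert it into a servo command. A toy sketch in Python, where the joint limits and the 1000-2000 microsecond pulse range are assumptions for a generic hobby servo, not real specs:

```python
def suit_angle_to_pulse(angle_deg, min_deg=-90.0, max_deg=90.0,
                        min_us=1000, max_us=2000):
    """Clamp a suit joint angle to the mech joint's safe range and
    map it linearly onto a servo pulse width in microseconds."""
    angle = max(min_deg, min(max_deg, angle_deg))
    fraction = (angle - min_deg) / (max_deg - min_deg)
    return round(min_us + fraction * (max_us - min_us))

print(suit_angle_to_pulse(0.0))    # centre of travel -> 1500
print(suit_angle_to_pulse(120.0))  # clamped to +90   -> 2000
```

The clamp is the important part: whatever the pilot's body does, the mech joint never gets commanded past its own limits.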

And humans kinda "train" their entire lives to walk, sense by touch and manipulate things.

But well... I don't see any piezoelectric VR suits, do you? It is probably a stupid idea, who knows...



Also, just to refresh your minds, let's remember the MegaBots giant robot duel that happened a couple of years ago, and oh boi...

I know they put all the effort in the world just to make their dreams real, they definitely are way better professionals than me and would be able to outsmart anything that I can come up with...

... But...

... Trying to control a humanoid mech with buttons and joysticks is really awkward and hard; not the best choice, I would say.

Discussions

dekutree64 wrote 12/01/2022 at 04:41 point

Yeah, the control issue is a big one for a full body mech. It may be worth researching how it's done in Gundam and similar series :) Most likely both the creators and the fans have put a lot of thought into it.

One option for direct joint control is to use absolute orientation sensors to create a motion capture suit. It's quite simple actually. You just need to define the hierarchy of the joints, and transform each sensor's orientation by the inverse of its parent's orientation. Then you can convert it to 3 euler angles to command the servos. Here's an old proof-of-concept test I did with one on the forearm to serve as the frame of reference and one on the back of the hand to measure the wrist angles https://www.youtube.com/watch?v=EoIBsBOmWLc

After that I made a modular wireless version, which is basically finished, but it's been a while since I worked on it. I need to make a project page for it. And I've since learned how to design PCBs, so I'd like to make an integrated sensor module. As it is, you have to solder a bunch of little wires between an Arduino, IMU board, and NRF24, so a whole body set would be time consuming. Stomach, chest, head, and 2x upper arm, forearm, hand, thigh, shin, foot: 15 sensors giving 34 independent axes in total, plus more if you add fingers and/or toes. For basic finger control you can just use a flex sensor on the first joint. Thumb could potentially use an orientation sensor for the first joint and a flex sensor for the second joint.
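dekutree64's parent-inverse trick can be sketched in a few lines of quaternion math. This is my own illustration of the idea he describes, not his actual code: quaternions are (w, x, y, z), the sensors are assumed to output unit quaternions, and for brevity only the yaw Euler angle is extracted:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conj(q):
    """Conjugate; for unit quaternions this is the inverse."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def relative_orientation(parent, child):
    """Child sensor's orientation expressed in the parent's frame:
    inverse(parent) * child."""
    return quat_mul(quat_conj(parent), child)

def yaw_deg(q):
    """Rotation about Z, one of the three Euler angles (ZYX convention)."""
    w, x, y, z = q
    return math.degrees(math.atan2(2*(w*z + x*y), 1 - 2*(y*y + z*z)))

def z_rotation(deg):
    """Unit quaternion for a rotation of `deg` degrees about the Z axis."""
    half = math.radians(deg) / 2
    return (math.cos(half), 0.0, 0.0, math.sin(half))

# Forearm sensor at 90 degrees, hand sensor at 135 degrees: the wrist
# joint angle is the 45-degree difference between them.
wrist = relative_orientation(z_rotation(90), z_rotation(135))
print(round(yaw_deg(wrist), 1))  # 45.0
```

The resulting joint angles are exactly what you would feed to the servo commands he mentions.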


FulanoDetail wrote 12/01/2022 at 10:50 point

Wow! You really know a lot about the subject, I hope I can use a little bit of your help in the future :)


FulanoDetail wrote 12/01/2022 at 11:35 point

I haven't started to research the absolute orientation sensors yet (your video is really cool, by the way), but from what I could read about the Gundam series, the controls really are like in a game.
The mobile suit has pre-programmed movements that are activated when pedals, buttons or joysticks are pressed/moved, and the pilot can select specific joints in order to change a movement on the fly in about a quarter of a second.
And a lot of the work is actually done by a learning AI that can interpret the commands of the pilot, much like fly-by-wire is used on today's fighter jets.
I don't think I would be able to pilot or build a system like the one in a Gundam suit, it seems really complicated (but cool nevertheless) xD


dekutree64 wrote 12/01/2022 at 16:00 point

Interesting :) But yeah, sounds hard.

That old video was using BNO055 sensors, but they're no good for real use (and seem to be prohibitively expensive now anyway). They continuously update their calibration values to try and prevent any drift of the angles, but it only works if you stop moving occasionally. Otherwise it just gets worse and worse and then takes a while to recover.

The newer version uses RTIMULib2 running on Arduino to process the data from an accelerometer/gyro/compass sensor. It was more stable, but I remember at some point it seemed like the angles were lagging behind the real position when printing to serial monitor. I need to get it set up to command a servo and see how responsive it is.

It's been so long, there may be better options available now. Although from a quick look around, it seems more like everything is out of production and there are no cheap 9-axis sensors at the moment.
