• More stuff, no new code yet

    Jacob David C Cunningham, 10/16/2020 at 21:35

    So I printed those parts out, then folded a standard 8.5" x 11" piece of printer paper along both the x/y (or is it z?) axes and drew lines as obvious references. Generally I mounted stuff against these lines. I didn't really measure (I was supposed to but was lazy), though I have a rough idea based on the paper's dimensions.

    I glued them on; I know the part dimensions since I made them in Google SketchUp (a CAD program). I set the "unit" back a certain distance and put down some basic markers, e.g. 6" away (too close), 1' away (usable), and 16" away.

    I'll go over some concept ideas of what I intend to do. These steppers have terrible slop in the gears; I'm talking 3-5" of swivel between the gear slop and the linkage slop...

    One main thing I don't know right now is how to guess how far away something is as a starting point. I mean, when you look at something that appears huge you'd think "that's close to me," but it could just be massive and far away... or it could look small and still be massive, just even farther away...

    But... this is generally intended for indoors, and I can do some preliminary bounding/scanning as a combination of CV and physical distance measurements (a LIDAR sweep).
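    To make that concrete, here's a rough sketch of the idea (the focal length number and the helper names are just placeholders, nothing calibrated): with a pinhole camera model the apparent size in pixels only pins down the ratio of real size to distance, so a single range reading from the lidar is what resolves the ambiguity.

    ```python
    # Pinhole relation: pixel_height = focal_px * real_height / distance.
    # A 2x bigger object at 2x the distance looks identical on camera,
    # so a distance reading (the lidar) is what recovers real size.

    FOCAL_PX = 500.0  # assumed focal length in pixels; would come from calibration

    def pixel_height(real_height_in, distance_in):
        """Apparent height in pixels of an object of known size at a known distance."""
        return FOCAL_PX * real_height_in / distance_in

    def real_height(pixel_h, distance_in):
        """Invert the relation once a distance measurement is available."""
        return pixel_h * distance_in / FOCAL_PX

    print(pixel_height(6, 24))   # 6" object at 24" -> 125 px
    print(pixel_height(12, 48))  # 12" object at 48" -> also 125 px (the ambiguity)
    print(real_height(125, 48))  # with a 48" range reading -> 12.0" recovered
    ```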

    Gahh, the "golden rule" would come in handy right about now regarding mixing sig figs/errors.

    Edit: this green part may not make sense. I'll probably have to elevate the base so that when the sensors are level, they'll be at the "center" of the rhombus thing.

    Some major goals are being able to do the shape finding/contour/area/blob detection/etc. with OpenCV, and also labeling (text tagging on the image by coordinate). I'd like that so it's easier to know what's going on in an image. From that I can do the sizing/pixel measurements and determine the angles to move the steppers, etc.
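    Here's a minimal sketch of what I mean by the contour finding plus labeling (Python/OpenCV; the threshold value, minimum area, and file names are placeholders I'd tune later):

    ```python
    # Find dark blobs, outline them, and tag each one with its index, area,
    # and centroid coordinates so it's obvious what the code thinks it sees.
    import cv2

    img = cv2.imread("frame.jpg")  # or a frame grabbed from the Pi camera
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)  # dark shapes on light paper

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for i, c in enumerate(contours):
        area = cv2.contourArea(c)
        if area < 100:  # skip specks
            continue
        M = cv2.moments(c)
        cx, cy = int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"])
        cv2.drawContours(img, [c], -1, (0, 255, 0), 2)
        cv2.putText(img, f"#{i} a={int(area)} ({cx},{cy})", (cx, cy),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.4, (0, 0, 255), 1)

    cv2.imwrite("labeled.jpg", img)
    ```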

    Also I want a GUI, preferably desktop, so I'll probably try to build something with a C++ graphics library (maybe wxWidgets). This UI would be for calibration/seeing an interactive program in real time vs. having to write code (after the code exists). This is pretty ambitious since I don't really use C++ or stuff that low level; browser-based, sure, no problem... I'm trying to avoid going that route, but we'll see. I'd like to put down some time to learn C++ in a real application.

    I also have to figure out some way to mock the pan/tilt/stepper motors, because it's annoying trying to code over SSH on the Pi (nano), it's so slow. It's also slow to code via FileZilla/direct edit.
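    The plan (very rough, all the names below are made up) is to hide the hardware behind a tiny interface and swap in a mock on the desktop, so the vision/geometry code can be developed without touching the Pi at all:

    ```python
    # Mock pan/tilt driver: records/prints the commands instead of moving anything.
    class MockPanTilt:
        def __init__(self):
            self.pan_deg = 0.0
            self.tilt_deg = 0.0

        def move_to(self, pan_deg, tilt_deg):
            print(f"[mock] pan -> {pan_deg:.1f}, tilt -> {tilt_deg:.1f}")
            self.pan_deg, self.tilt_deg = pan_deg, tilt_deg

    def get_pan_tilt(use_hardware=False):
        """Return the real driver on the Pi, the mock everywhere else."""
        if use_hardware:
            from real_pan_tilt import PanTilt  # hypothetical module that talks to the Arduino
            return PanTilt()
        return MockPanTilt()
    ```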

    The dimensions will be related, so there will be a "center sensors" function that looks at the calibration guide above (the raised platforms) and centers against the middle rhombus thing. It's all rough dimensions... then it will have a max arc-sweep and track position... there will be loss/drift/etc...
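    Roughly what I have in mind for that function (find_middle_marker(), the degrees-per-pixel number, and the camera/pan-tilt objects are all assumptions at this point):

    ```python
    # Nudge pan/tilt until the middle calibration marker's centroid sits near
    # the image center. The sign of the correction depends on how things are mounted.
    DEG_PER_PIXEL = 0.06  # made-up conversion; depends on camera FOV and resolution

    def center_sensors(camera, pan_tilt, tolerance_px=5, max_steps=30):
        for _ in range(max_steps):
            frame = camera.capture()                # numpy image array
            cx, cy = find_middle_marker(frame)      # centroid of the rhombus target
            err_x = cx - frame.shape[1] // 2
            err_y = cy - frame.shape[0] // 2
            if abs(err_x) <= tolerance_px and abs(err_y) <= tolerance_px:
                return True                         # close enough, call it centered
            pan_tilt.move_to(pan_tilt.pan_deg - err_x * DEG_PER_PIXEL,
                             pan_tilt.tilt_deg - err_y * DEG_PER_PIXEL)
        return False                                # gave up; slop/drift too big
    ```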

    Cat tax heh

  • Printed top layer, in one piece now

    Jacob David C Cunningham, 10/13/2020 at 00:01

    I have lost interest in this thing... the desire is still there for 3D mapping, but I am still very much an OpenCV noob. This method is probably dumb/bad. Still, I like the idea of "real dimensions" vs. estimated/assumed from, say, stereoscopy. Granted I am not an expert in any of these fields... just BS-ing.

    Right now I'm printing some shapes with known heights. I made this basic L-shaped thing so I had a base/platform to mount those raised shapes/angles on; the distance sensors will then calibrate against it. I'll give myself a handout by giving the shapes a black surface (Sharpie) so they stick out against the white paper background.
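    The point of the known heights is that one measurement against them gives the camera's pixel scale. A quick sketch of that (the numbers are placeholders):

    ```python
    # Solve the pinhole relation for focal length using a calibration shape of
    # known height, its measured height in pixels, and the distance sensor reading.
    def focal_from_calibration(pixel_h, distance_in, real_height_in):
        # pixel_h = focal_px * real_height / distance  =>  focal_px = pixel_h * distance / real_height
        return pixel_h * distance_in / real_height_in

    print(focal_from_calibration(pixel_h=125, distance_in=12, real_height_in=3))  # -> 500.0 px
    ```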

    I'm super beat today so I can't really think anymore; I just wanted to post something. I'm actually trying to push this project aside, while still making some notable progress (like the calibration GUI written in something that's not web-based or a web wrapper). That's a learning thing, to give myself a (false) sense of value on the market.

    The other project is a true 4-legged robot with 12 servos. I saw some videos on YouTube showing it's possible to get decent movement with the cheap 9g servos and 4 legs... this time I'm using one of those 18650 cells on purpose, with a boost converter. The plan is to use a sweeping ultrasonic sensor for navigation. I also got a little BLE module; I've never used one before, but it looks like a pretty cool/simple device (4 pins).

    I could have switched these to servos, but honestly I'm done with it. I'll spend more time on the code than the build... the top piece is solid (the battery area is not glued) and took 7 hours to print... the front-most vertical supports are extra... also the camera mount is backwards and not tall enough (for the ribbon cable).

  • Current state

    Jacob David C Cunningham, 09/29/2020 at 23:47

    I realize this probably has major flaws: something that is 10 ft away vs. 5 ft away will have different measurements angle-wise, so idk... it might be a very dumb design/idea, I guess I'll find out at my own expense. I am aware spinning lidars exist that give you a nice real-time shape of the surroundings, at least in a 2D plane; this is not going to be fast. As mentioned, the computation is on the Pi Zero, which is laughable, so I'll probably use a remote computer (a full-sized Pi on the same network).
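    A quick sanity check on the angle-wise worry: one angular step covers a bigger lateral slice the farther away the target is, so the sweep gets coarser with distance (the step size below is a guess, not the actual hardware's):

    ```python
    # Lateral spacing between sweep points at a fixed angular step.
    import math

    STEP_DEG = 1.0  # assumed angular step per sweep position

    for distance_ft in (5, 10):
        lateral_in = distance_ft * 12 * math.tan(math.radians(STEP_DEG))
        print(f"{distance_ft} ft away: ~{lateral_in:.2f} in between sweep points")
    # ~1.05 in at 5 ft, ~2.09 in at 10 ft
    ```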

    Anyway, it's mostly an excuse for me to learn OpenCV and also to give my robots a brain, since they just run into stuff.

    "center of the centroid"

    I intended to have these bumpers, like how 3D printers "zero" themselves, but I'm not going to invest much more time into this thing since it's a bad design to begin with. I'm just going to work on getting a thing that's in one piece and can control two continuous-rotation servos (a wheeled robot), and I'll calibrate it with a printed board/surface with known heights/distances. The camera will face that irregular platform, find the centers of those shapes, and calibrate; I'll use easy colors, e.g. red/green/blue.
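    For the color-based calibration, the rough idea is to mask each color in HSV and take the centroid of the biggest blob (the HSV ranges below are ballpark values that would need tuning for the actual lighting/camera):

    ```python
    # Find the centroid of the largest blob of each calibration color.
    import cv2
    import numpy as np

    RANGES = {
        "green": ((40, 80, 80), (80, 255, 255)),
        "blue":  ((100, 80, 80), (130, 255, 255)),
        # red wraps around hue 0, so it needs two ranges in practice
    }

    def find_marker_centers(frame_bgr):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        centers = {}
        for name, (lo, hi) in RANGES.items():
            mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                continue
            c = max(contours, key=cv2.contourArea)
            M = cv2.moments(c)
            if M["m00"]:
                centers[name] = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
        return centers
    ```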

    Edit:

    One problem/consideration is whether to use two "computers," or one plus a microcontroller. I liked the Arduino for things to do with servos and also for the analog aspect (no need for an ADC on the Pi), but the issue is the communication between the two things (over serial). I had problems with hardware serial on the Elegoo Arduino Nano. The "brain" or RPi would do the calculations and then send that info to the Arduino, which would then control the lidar servos for pan/tilt and the robot (servos/wheels).
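    What I picture for that link is the Pi sending small text commands over serial and the Arduino parsing them; something like this on the Pi side (the port name, baud rate, and command format are all assumptions, not a settled protocol):

    ```python
    # Pi side: send a pan/tilt command and wait for a one-line ack from the Arduino.
    import serial  # pyserial

    ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

    def send_pan_tilt(pan_deg, tilt_deg):
        ser.write(f"PT {pan_deg:.1f} {tilt_deg:.1f}\n".encode())
        return ser.readline().decode().strip()  # e.g. "OK" echoed back

    print(send_pan_tilt(10.0, -5.0))
    ```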

    I also thought about how to have the servos directly mounted to the axis, without taking up significantly more room, using bevel gears and direct mounting.

    Ehh it still looks bad/takes up too much room