05/18/2022 at 22:33
I happened upon some brilliant code on GitHub for doing optical flow in the browser. Open this URL on your smartphone, then go to the "Zone" demo (the Pong one is fun, but it doesn't show enough detail to see what is happening). Allow it to access your camera; then, instead of moving your hand as instructed, just tip the phone slowly left and right, as if it were mounted on a robot. Note that the yellow dot does a good job of tracking the rotation.
If you move too fast it loses track, but that can be adjusted, and for a slower-moving (careful) version of the bot it might be just fine, at least when combined with the compass and other sensors in the phone.
The code is open source (note the "Fork me..." ribbon on the page).
Just another example of the amazing robotic control things that can be done in a smartphone browser without installing anything.
05/16/2022 at 16:02
I purchased a set of these sensors for $10+ on Amazon.
The GL5528 seems to work best, just plugged in between pin 7 and ground. I set pin 7 as an input with the internal pull-up enabled and was able to sense light and dark just by passing my hand over the sensor.
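For anyone duplicating this, here is a minimal sketch of that setup. Pin 7 and the pull-up wiring are from the text above; the serial printout is my own addition for checking it at the bench.

```cpp
// GL5528 LDR wired between pin 7 and GND. With INPUT_PULLUP, the
// internal pull-up resistor and the LDR form a voltage divider:
// bright light lowers the LDR's resistance and pulls the pin LOW,
// while darkness lets the pull-up win and the pin reads HIGH.

const int LDR_PIN = 7;

void setup() {
  pinMode(LDR_PIN, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(LDR_PIN) == HIGH) {
    Serial.println("dark");   // e.g. a hand over the sensor
  } else {
    Serial.println("light");
  }
  delay(100);
}
```

Because the pin is read digitally, the light/dark threshold is just the input's logic threshold; no analog calibration is needed for a simple go/no-go test.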
Next steps:
- Add another channel for the clock
- Write some Arduino code to show the state of those pins on a graph, and try different transmission speeds to see if the signal gets through at "fast enough" rates
- Try the Wire library to see if I can receive I2C data
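The first two steps might look something like this. Pin 8 for the clock channel is my assumption (the text only fixes the data LDR on pin 7), and the sample rate is a starting point, not a tested value.

```cpp
// Sample a data pin and a (hypothetical) clock pin and print both
// for the Arduino IDE's Serial Plotter, so the two square waves can
// be inspected at different transmission speeds.

const int DATA_PIN  = 7;   // GL5528 LDR to ground, internal pull-up
const int CLOCK_PIN = 8;   // second LDR, same wiring (assumed pin)

void setup() {
  pinMode(DATA_PIN, INPUT_PULLUP);
  pinMode(CLOCK_PIN, INPUT_PULLUP);
  Serial.begin(115200);
}

void loop() {
  // Offset the clock trace by 2 so the two traces don't overlap
  // in the Serial Plotter window.
  Serial.print(digitalRead(DATA_PIN));
  Serial.print('\t');
  Serial.println(digitalRead(CLOCK_PIN) + 2);
  delayMicroseconds(200);   // ~5 kHz sampling; adjust to taste
}
```

If the traces look clean here, the Wire library test (step three) is the next thing to try on the same two pins.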
My time is stretched to the breaking point right now, so if someone wants to help out by duplicating this, doing those steps, and sharing the code/results, I would be VERY appreciative.
04/12/2022 at 17:26
Add a microswitch at the bottom of the slot where the smartphone is inserted, so that inserting the phone automatically powers up the bot.
The script on the web page can listen to its accelerometer sensors and send the motors commands for short forward and back bursts in opposite directions; i.e., it tries to twist back and forth. If the sensors don't see that pattern, then either the phone isn't in the robot or the bot isn't sensing it correctly, so the phone knows it's in "not inserted" mode and keeps sending that pattern, looking for motion.
On power-up, the bot will expect to see that pattern and will use it to set levels on the A2D. This compensates for differences in screen brightness, lighting conditions, alignment, etc. Once calibrated, the bot will start moving, and the phone will feel it: "I'm in! Switch to ready-to-go mode."
If, during normal operation, a motor command doesn't generate the expected motion, the phone can initiate a test twist; if that fails, it switches back to "I'm out!" mode.
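The "did we feel the twist?" check above boils down to counting amplitude-gated sign reversals in the accelerometer stream. Here is a sketch of that logic in plain C++ (on the phone this would live in the page's JavaScript; the function names, deadband, and four-reversal threshold are my own illustration, not the project's code):

```cpp
#include <vector>
#include <cmath>

// Count how many times the signal reverses sign with enough
// amplitude to rule out sensor noise (deadband is in the same
// units as the samples).
int countSignFlips(const std::vector<double>& z, double deadband) {
  int flips = 0;
  int lastSign = 0;
  for (double v : z) {
    if (std::fabs(v) < deadband) continue;  // ignore noise near zero
    int sign = (v > 0) ? 1 : -1;
    if (lastSign != 0 && sign != lastSign) ++flips;
    lastSign = sign;
  }
  return flips;
}

// The bot twists back and forth a few times; if the phone saw at
// least minFlips reversals in the window, conclude it is seated in
// the robot ("I'm in"); otherwise stay in "not inserted" mode.
bool sawTwistPattern(const std::vector<double>& z,
                     double deadband = 0.5, int minFlips = 4) {
  return countSignFlips(z, deadband) >= minFlips;
}
```

The same window of samples can double as the calibration data for the A2D levels, since it was captured while the pattern was known to be playing.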