FPGA Tracking Video
I received a De0 Nano development board from Newark and was having great difficulty getting my head around the Verilog coding.
Then luck and great fortune... Parallax opened up a "Jar of Grubs" by releasing a Propeller Emulator for FPGA systems.
It was not long before I realised that I could load my FPGA with the Propeller communications service I had written for GroG's MyRobotLab, giving the De0 Nano an easy way to access the outside world using Java and Python coding. (It still confuses me as to which language I ought to be programming this whole thing in.)
So with 0% extra work the MyRobotLab framework is also accessible to FPGA systems (via the Propeller Emulator). (Three birds with one stone.)
The Robotic Tracking system :-
Consists of a Webcam mounted on my Plexi "PlankOTilt" camera gimbal.
I designed these a while back and they were laser-cut in plexiglass.
It's designed so that the webcam sits at the central pivot, giving the best tracking possibilities.
The De0 Nano running the Parallax Propeller Emulator.
As you can see, the MRL communication system takes 4 cogs to run; yes, the FPGA supports parallel processing by default.
The interlink cable was spliced from an IDE hard drive cable (pin-for-pin perfect).
The MRL software uses an OpenCV service to extract and process the webcam video stream.
The tracking service can detect points in the video stream; it is also able to detect faces/bodies/colours/floors etc. etc.
A small bit of Python code is used to extract the tracking details and process them to drive the servo gimbal.
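The idea behind that bit of Python can be sketched in plain code. This is my own minimal sketch, not MRL's actual API; the function names, frame size, and gain value are all assumptions for illustration:

```python
# Sketch only: turn a tracked point's offset from frame centre into new
# pan/tilt servo angles. Names and values are hypothetical, not MRL's API.
FRAME_W, FRAME_H = 640, 480   # assumed webcam resolution
GAIN = 0.05                    # assumed degrees of servo movement per pixel of error

def track_to_servo(point_x, point_y, pan_pos, tilt_pos):
    """Nudge the gimbal so the tracked point moves towards frame centre."""
    error_x = point_x - FRAME_W / 2   # positive: object is right of centre
    error_y = point_y - FRAME_H / 2   # positive: object is below centre
    # Move proportionally to the error, clamped to a 0-180 degree servo range
    pan_pos = max(0, min(180, pan_pos + GAIN * error_x))
    tilt_pos = max(0, min(180, tilt_pos - GAIN * error_y))
    return pan_pos, tilt_pos
```

Each video frame the tracked point is read, the offset from centre is computed, and the servos are stepped towards it.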
MRL has PID (Proportional Integral Derivative) services that allow the servo movements to be tuned for best speed and control.
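For anyone unfamiliar with PID, here is a minimal controller in the spirit of what the service does. This is my own sketch, not MRL's implementation; the class name and gains are assumptions:

```python
# Minimal PID controller sketch (not MRL's code): output is the sum of a
# proportional, an integral, and a derivative term on the tracking error.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0       # accumulated error over time
        self.prev_error = None    # last error, for the derivative term

    def update(self, error, dt=1.0):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Tuning means picking kp, ki, and kd so the camera swings onto the target quickly without overshooting and oscillating; the P term reacts to the current error, I removes steady offset, and D damps the movement.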
The end result is that the camera centres on the object, never letting it out of its sight.