
Playing follow the leader

A project log for Lawnny Five

A heavy-duty robotic lawn tractor with interchangeable implements

Jim Heising • 04/27/2024 at 01:54

Made some great progress this week in adding more functionality to Lawnny 5 by starting to leverage machine learning and computer vision!

One of my goals for Lawnny is to have him follow me around the yard without my having to whip out a controller. My idea was to use computer vision to recognize hand gestures and then track my hand in real time, and that's pretty much what I've been able to accomplish this week.

I was able to accomplish this using a Luxonis OAK-1 Lite camera (https://shop.luxonis.com/collections/oak-cameras-1/products/oak-1-lite?variant=42583148101855) and the excellent GitHub project depthai_hand_tracker (https://github.com/geaxgx/depthai_hand_tracker).

Here is the basic way it works:

  1. The camera loads an ML model for hand-gesture recognition.
  2. I've set the ML model to recognize any hand holding up the index finger in a "we're #1!" pose.
  3. Any time the #1 pose is recognized, I calculate the distance to my hand using the focal length of the camera and the size of the bounding box around my hand in pixels (there's a sketch of the math right after this list). Even though having a depth component in the camera would be nice, I'm very impressed with how accurate it is using just basic math.
  4. I feed the distance to my hand into a PID controller with a goal/setpoint of 1 meter from me.
  5. I also calculate the distance in pixels from my hand to the center of the image.
  6. I feed the distance-to-center into another PID controller with a goal/setpoint of 0 pixels from center.
  7. Each PID output gets converted into a ROS2 Twist message and published for the motor controller to pick up and move the robot (see the second sketch below).
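For the distance estimate in step 3, the pinhole camera model boils down to similar triangles: distance = focal length (in pixels) × real hand width ÷ bounding-box width (in pixels). Here's a minimal Python sketch; the focal length and hand width are placeholder values I've picked for illustration, not calibrated numbers from the project:

```python
# Pinhole-model distance estimate from a bounding box.
# Both constants are illustrative assumptions, not Lawnny's calibration.
FOCAL_LENGTH_PX = 450.0  # camera focal length expressed in pixels (assumed)
HAND_WIDTH_M = 0.08      # rough real-world hand width in meters (assumed)

def estimate_distance_m(bbox_width_px: float) -> float:
    """Similar triangles: Z = f * W / w."""
    return FOCAL_LENGTH_PX * HAND_WIDTH_M / bbox_width_px

# Example: a bounding box 36 px wide reads as 1 meter away.
print(estimate_distance_m(36.0))  # -> 1.0
```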
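And here's a hedged sketch of steps 4 through 7, with both PID outputs folded into a single Twist for simplicity. The gains, topic name, image width, and method names are assumptions for illustration; only rclpy and geometry_msgs are the real ROS2 pieces:

```python
# Sketch of steps 4-7: two PID controllers driving a ROS2 Twist message.
# Gains, topic name, and frame width are illustrative guesses, not tuned values.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

IMAGE_WIDTH_PX = 640  # camera frame width in pixels (assumed)

class PID:
    """Bare-bones PID controller."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt  # assumes dt > 0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

class HandFollower(Node):
    def __init__(self):
        super().__init__('hand_follower')
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        # Step 4: hold the hand at a 1-meter setpoint (placeholder gains).
        self.distance_pid = PID(kp=0.8, ki=0.0, kd=0.1, setpoint=1.0)
        # Step 6: hold the hand at 0 pixels from the image center.
        self.center_pid = PID(kp=0.005, ki=0.0, kd=0.001, setpoint=0.0)

    def on_hand(self, distance_m, bbox_center_x_px, dt):
        # Step 5: signed pixel offset of the hand from the image center.
        offset_px = bbox_center_x_px - IMAGE_WIDTH_PX / 2
        # Step 7: fold both PID outputs into one Twist and publish it.
        twist = Twist()
        # Hand farther than 1 m -> negative error -> negate to drive forward.
        twist.linear.x = -self.distance_pid.update(distance_m, dt)
        # Hand right of center -> negative output -> clockwise (right) turn.
        twist.angular.z = self.center_pid.update(offset_px, dt)
        self.pub.publish(twist)
```

In the real node, something like on_hand() would run once per camera frame with the tracker's bounding box, after rclpy.init() has been called and the node spun up.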

The PIDs need a little tuning, but as you can see, they seem to be working pretty well!
