Latest Updates

A project log for Lawnny Five

A heavy-duty robotic lawn tractor with interchangeable implements

Jim Heising 04/22/2024 at 18:22

It's been a while since I've posted, but that doesn't mean that Lawnny 5 hasn't been a very busy robot. Here is an update of progress:

Active Duty

One of the reasons I haven't been posting as much is that recent progress on Lawnny has been a bit boring, but also extremely encouraging. What I mean is that he has been doing boring, but very real and useful, work in the yard almost every weekend for a few weeks in a row now. Reliability has been stellar, and he's pretty much fulfilled all of my original goals for the project. Here are a couple of videos of him in action:

That being said, even though he's doing regular and useful work, I still have a lot more planned for him.

Dethatching, Aeration and Spreading

One of the jobs I do twice a year is dethatching, aerating and fertilizing, and as I was doing laps around the lawn a few weeks ago I had a sudden thought: why the heck am I doing this when Lawnny could be doing it? I have no idea why it didn't occur to me sooner. Anyway, because Lawnny is designed around trailer hitches and is meant to function identically to a ride-on mower/tractor, it was an easy solve. Currently I'm looking at purchasing and hooking up one of these:

It should hopefully be an easy hookup, and I'll post some video soon.


Simulation

This has been a lot slower going than I would have ever thought. The main reason is the sorry state of affairs in robotics simulation for people on Macs. I decided very early on that I wanted to do a lot of autonomous software development in simulation before I got anywhere near doing it in the real world, so I knew getting a reliable simulation environment up and running was going to be super important. What I did not realize was how ridiculously bad support is for doing this on Apple Silicon. It's a long story which I may write about in the future, but at this point I have to run a virtual machine within a virtual machine (Docker running in UTM) to get anything to work, and it's still very flaky. Worse, it took me nearly a month to get to the point of even being able to run a simulation. As for simulation environments, I've chosen to use Webots; Gazebo just seems overly janky to me.

<small_rant>I have to say that my experience with open-source robotics software over the past few months has been anything but pleasant, and it shows how ripe this area is for disruption. Obviously I can't complain too much (I'm using free software that people are spending their free time building), but it's just surprising. When you compare the open-source robotics ecosystem to something like open-source web application frameworks (the stuff I'm used to), it's striking how far behind robotics is. Granted, robotics is a much more complex problem than building web apps, but even the newest web frameworks seem light-years more refined than some of the most mature robotics projects. After learning more about the robotics industry during this project, I suspect it really just comes down to economics: very few robotics companies actually make money, and VC investments are becoming harder and harder to get. I am convinced this will change eventually when we hit an inflection point, but who knows how long that will take.</small_rant>

First Steps Towards Autonomy

I've decided that my first step towards autonomy will be to use my Luxonis Oak-1 Lite camera, along with the amazing work in this repo, to get Lawnny to follow me around the lawn using hand signals. Here is the basic idea of what I'm thinking:

  1. The mobile app will be used to set Lawnny into "follow-me" mode.
  2. This will start up a ROS2 lifecycle node that will run some of the code in the depthai_hand_tracker repo.
  3. Holding up my index finger in a "we're number 1" pose will cause Lawnny to follow my finger and try to center it horizontally by turning left and right. I assume I'll have to set up a simple PID loop; I'm not sure whether something like Nav2 is overkill in this scenario. Any other hand pose will cause Lawnny to stop.
  4. The Oak-1 Lite camera is monocular and does not contain a depth component, so I don't necessarily have a sensor with which to determine the following distance from a person's hand. I am thinking I will just use the size of the bounding box around the hand to determine a relative distance— i.e. move forward until the bounding box of the hand takes up X percentage of the camera frame.
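Steps 3 and 4 above can be sketched as a simple proportional controller (the simplest version of the PID loop mentioned in step 3). This is just a sketch of the idea, not code from the project: the frame size, gains, target area fraction, and the `FollowCmd` container are all illustrative assumptions.

```python
# Sketch of "follow-me" control: turn to center the hand's bounding box
# horizontally, and drive forward/back until the box occupies a target
# fraction of the frame (the monocular distance proxy from step 4).
# All constants below are assumptions, not tuned values from Lawnny.
from dataclasses import dataclass

FRAME_WIDTH = 640            # assumed camera frame width in pixels
FRAME_HEIGHT = 480           # assumed camera frame height in pixels
TARGET_AREA_FRACTION = 0.03  # assumed bbox/frame area ratio = "close enough"
KP_TURN = 1.5                # proportional gain for turning (assumption)
KP_DRIVE = 20.0              # proportional gain for forward speed (assumption)

@dataclass
class FollowCmd:
    linear: float   # forward speed (positive = forward), arbitrary units
    angular: float  # turn rate (positive = turn left), arbitrary units

def follow_cmd(bbox):
    """bbox = (x, y, w, h) of the detected 'number 1' hand in pixels,
    or None when any other pose (or no hand) is detected."""
    if bbox is None:
        return FollowCmd(0.0, 0.0)  # any other pose: stop
    x, y, w, h = bbox
    # Horizontal error: bbox center offset from frame center, normalized
    # to [-1, 1]. Positive error means the hand is left of center.
    error_x = ((FRAME_WIDTH / 2) - (x + w / 2)) / (FRAME_WIDTH / 2)
    angular = KP_TURN * error_x
    # Distance proxy: bbox area as a fraction of the frame area. Drive
    # forward while the hand looks smaller than the target size, and back
    # up if it looks larger (i.e. the person is too close).
    area_fraction = (w * h) / (FRAME_WIDTH * FRAME_HEIGHT)
    linear = KP_DRIVE * (TARGET_AREA_FRACTION - area_fraction)
    return FollowCmd(linear, angular)
```

In a ROS2 setup the output would presumably be mapped onto a `geometry_msgs/Twist` published to the existing teleop pipeline, so the follow-me node just becomes another source of velocity commands.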

What do you think? Are there better ways of solving this?