DIY Golf Launch Monitor

Launch monitor using low-cost Raspberry Pi and camera hardware to determine ball launch speed, angles, and spin

The project's main goal is to produce software and hardware designs for a fully-functional golf launch monitor that avoids the need for expensive high shutter-speed and high frame-rate cameras. Building the system has required developing techniques for high-speed, infrared, strobe-based image capture and processing using cameras such as the Pi Global Shutter camera (~$50 retail) and Raspberry Pi single board computers. A first version is built and currently under test. See the Project Log pages for the latest news!

Ultimately, the project should be able to accurately determine ball speed, launch angles, and--importantly--ball spin in 3 axes. The output of the system will be accessible in a stand-alone web-based app. Outputs and interfaces to popular golf-course simulator systems such as GSPro are in progress and will be tested soon.

The project will likely be released as open-source, though patent applications on some of the IP have been filed to keep options open.

Overview

My personal goal has been to use this project as a learning platform for myself.  So far, I’ve been able to explore all sorts of software and hardware that I hadn’t worked with previously.  I’ve been a kid in a candy store, getting to learn all sorts of new technology, techniques, libraries, and platforms.  This has also forced me to spin up on my linear algebra again.

Technologies

Some of the technologies in use so far are:

  • Active-MQ (multi-protocol messaging middleware)
  • Apache Jakarta Server Pages (serves the web-based default system GUI)
  • Boost (various foundational libraries for logging, multi-threading, thread-safe data structures, command line processing, etc.)
  • Debian (Raspbian) Linux (OS)
  • JSON (serialization)
  • JMS (messaging)
  • KiCad (schematic capture and PCB layout)
  • libcamera and rpicam-apps (open-source camera stack & framework)
  • Maven (java build automation)
  • Meson & Ninja (build tools)
  • Msgpack (efficient architecture-independent serialization)
  • OpenCV (image processing & filtering, matrix math)
  • PiGPIO (GPIO and SPI library)
  • Raspberry Pi Single-Board Computers
  • Tomee (hosts the web-based interface)

The software is written almost entirely in C++, with the goal of having the codebase look close to commercial standards in terms of testability, architecture, and documentation.  There are some utilities (for functions like camera and lens calibration) that were easier to implement in Python and/or Unix scripts.

Cost

I’d like to have the entire system cost less than US$300.  I’m currently over budget by about a hundred dollars, but have plans that should bring that price down.  For example, the new Raspberry Pi 5 should be able to support both of the system cameras, which could decrease the price by around $50 by avoiding a second Pi.

The high-level parts list at this time:

Additional Project Videos:

  • First Patent Application Published

    James Pilgrim, 06/06/2024 at 17:49

    Our first U.S. patent application was published by the USPTO today.  

  • A Little Faster - Fastest Recorded Ball on DIY LM So Far...

    James Pilgrim, 06/06/2024 at 17:44

    Here's the fastest recorded shot processed by the DIY LM so far: 136 mph (61 m/s).  Still hoping to get something a lot closer to 100 m/s.  I think it's the fastest backspin so far as well.  Thanks to Dave L. for the help!

  • First > 100 mph Shot

    James Pilgrim, 05/23/2024 at 21:16

    Ok, so that's not very fast, but while testing something today, I took a club and just tried to really whack the ball.  And I noticed that, for the first time I can remember, the system recorded a shot (barely!) over 100 mph (45 m/s).  Sadly, I can't hit the ball a whole lot faster than that, but it appeared to be recorded correctly.  There was also plenty of additional room on the screen for the strobe pulses to be pushed further apart and further to the right of the screen.  That means that higher speeds should not be a problem.

    Unfortunately, I had the "on" strobe set to an unusually long time (>30 µs), so that may have contributed to the blurring that is visible in the ball-spin analysis images, which in turn resulted in an inaccurate spin measurement, as seen in the mismatch between the second ball image and the as-calculated-spin ball (third ball image).  Still, kind of satisfying.

  • Faster Shot Processing

    James Pilgrim, 05/22/2024 at 17:23

    The LM is currently down to about a 5-second delay between hitting the ball and seeing the shot in the golf simulator.  This includes full processing, including 3D spin down to 1 degree in each axis.  See video here.

    This speed is not as fast as it needs to be, but is getting closer.  The recent speed-up is due primarily to utilizing more processing cores, turning down debugging, and moving up to a Pi 5 from a Pi 4.  Additional speed increases should be available by decreasing the network traffic that currently occurs after a shot.  I still think the LM should be able to get to less than two seconds.

  • First Commercial LM (Uneekor XO) Comparison!

    James Pilgrim, 05/17/2024 at 23:34

    Today for the first time, we were able to make a few comparisons of the DIY Launch Monitor's outputs to a commercial LM.  It’s also the first time testing in a more professional simulator environment.  The bed-sheet-over-PVC-piping-in-the-basement is still an option, but was pretty limiting.  And funny.  Instead, we now have access to a large simulator bay with the Uneekor overhead and running TGC 2019 (*).

    The results?  Well, pretty decent for a first try.  See the example video here.  Lots of work to do, obviously, and WTH is going on with the side spin (at least in this one test case)?  We also need to stop truncating the output to integers in order to have a more meaningful comparison.  Hopefully these aren’t difficult fixes.  I’m not too unhappy right now in any case, given that the material costs of the DIY LM are around 1/20th the cost of a Uneekor.  However, I’m certain the DIY LM can do a lot better than its current performance.

    BUT, the main problem with the comparison environment was something I should’ve foreseen, but didn’t.  The Uneekor also appears to use an infrared strobe.  I never knew how it worked until I started this comparison, and it appears to run a high-speed IR strobe at all times once the ball is teed up.  This is wreaking havoc on the DIY LM!  See the bright-orange mess on the LM’s user interface in the video.  Current ideas include trying to notch-filter whatever wavelength the Uneekor is centered on, as well as some type of filter to remove the linear ghost images of the golf shaft as it moves.  I'm curious how other folks do comparisons of this sort (to the extent anyone has).

    ___________

    (*) I’d really like to complete an interface to TGC 2019.  The DIY LM already works with GSPro and E6.  But no one at 2K Games (pr@2k.com) answers our emails, and I can’t locate a business contact there.  Anyone who knows someone, please DM me!

  • E6/TruGolf Simulator Interface

    James Pilgrim, 04/27/2024 at 21:44

    The first version of the E6/TruGolf interface to the DIY Launch Monitor is now working!

    BIG thank-you to the folks at TruGolf, especially Melissa and Ryan for their help and quick turn-around on questions.  I’m super-appreciative that they (and GSPro) have been so supportive of what is currently a small Launch Monitor project.

    Here’s a short video demonstration of a couple of shots:  DIY Launch Monitor - E6/TruGolf Interface 

    We’re under NDA with TruGolf, so I can’t say anything specific about their API.  However, suffice it to say that it’s pretty sophisticated and exposes an impressive amount of functionality.  I hadn’t spent much time with their product before now, and wasn’t aware of its breadth and depth.

    The LM’s interface is pretty minimal right now (mostly just player info and shot data), but it will be fun to flesh it out further in the future.  However, the next big push is getting the LM to be a lot faster when calculating shot data.

  • New Putting Mode and LM User Interface

    James Pilgrim, 04/23/2024 at 17:02

    Ultimately, the system shouldn't care whether a player is putting as opposed to hitting a full-speed shot with a driver.  But for right now, it's easier to accommodate the different image processing parameters in those two scenarios by telling the system when you're putting.  I've got a plan to automatically switch between modes, but I'd like to get some higher-priority features going first, such as the TruGolf interface and switching over to the Raspberry Pi 5.

    The system can be set to putting mode either by using the built-in user interface (see near the end of the linked video), or by using the golf simulator (in this case, GSPro) to select the putter as the current club.  Once selected, the system configures itself for the slower putting speeds.  I've been able to accurately determine putts as slow as 1.5 mph, and as fast as almost 20 mph.

    The following video gives a little demonstration of a couple of putts. 

  • Printed Circuit Boards are Working!

    James Pilgrim, 04/17/2024 at 18:11

    The first run of printed circuit boards arrived today from JLCPCB.  These are the boards that help connect the components of the system and also keep the high voltage strobe signals isolated from the fairly fragile Raspberry Pi GPIO circuitry.  

    The boards look good.  Especially for the very low price that JLCPCB charges for small boards like this.  The entire thing, including (slow) shipping, was a whopping US$3.64.  I wasn’t able to get anything locally (which I would have preferred) for less than $300!

    Here’s the unboxing pictures.  I’m sure they’ll go viral soon...

    It only took a few minutes to populate the board.  By dumb luck, all the holes ended up in the right places!  Pretty surprising given that this was my first attempt at having PCBs manufactured and also because I'm using an unfamiliar CAD system.  Of course, it's a pretty simple circuit, so maybe even I couldn't screw it up. ;)

    Here’s the finished (populated) and empty boards:

    Most amazing of all, is that everything tested out and worked on the first try:

  • A Preliminary Accuracy Analysis - Spin Measurements

    James Pilgrim, 04/14/2024 at 20:31

    Background:

    Several folks have asked about accuracy and precision.  That’s a big topic, so I decided to start on only a single aspect – spin analysis – and just provide a couple of examples.  I haven’t worked out a formal, full, math-based error analysis yet, and am not certain I can. Instead, this log entry will just focus on providing anecdotal descriptions of the errors in what I believe are ‘typical’ scenarios.  I’m not even sure that a math-only analysis is a reasonable goal – there are an awful lot of case-specific potential places for errors to creep in, and it may not be possible to enumerate them all. I’d also probably have to go back to my grad-school books on error analysis and remember all of that. Ugh.  Of course, it would be great if this was all open source someday and some smarter folks could dig into the math and tear it all apart!

    Another important background point is that a real “ground truth” against which to compare the system is hard to find.  You’d literally have to go to a real golf course, set up the LM (it would have to be a very cloudy day to prevent the camera from being flooded with IR), and then measure the real-world distance, angle, etc. on the course and compare it to the results of the physics model in use by the LM/Simulator.  I don’t have those facilities yet. :/  As an alternative, the analysis here will focus on comparing a best estimate of the ‘correct’ ball spin using visual techniques against the ball spin determined by the LM.  We’ll also make a comparison of how those differences manifest themselves in pseudo-real-world coordinates in a simulator (here, GSPro).  The carry distance will also, of course, depend on the measured launch angles and velocity, but we’re focused on spin here.  I’ll examine those measurements and the related accuracy in a later log entry.  So, let’s start…

    Analysis:

    Spin speed accuracy is primarily governed by the accuracy of the angular delta measurement between the ball exposures and the accuracy of the determined time delta between the exposures.  The former is currently most important.  Precision is limited by the fact that the system only deals with degrees of spin as an integer, so 1 degree is the upper limit on precision.

    Let’s start with the following first example, where the strobed ball images are shown below.  I chose these examples because they have a ball with strong registration markings that ended up facing the camera.  That makes for easier visual analysis of the spin.  These were also images where the Z-rotation was fairly substantial (these shots were using a 7-iron combined with crappy golf skills), and the X and Y rotations were near-zero, making it easier to focus on the back-spin.

    The system chose two ball exposures from the above image to use for spin analysis.  Those are the first two images, below, and labeled ‘1’ and ‘2’ above.

    From these images, the system determined XYZ Angles (in degrees) of -5, 0, and 37.  The -5 is side-spin, and the 37 is backspin. The 0 y-axis is “rifle-bore” spin, which is usually near zero.  The system determined the time between the two images to be 3500 µs (3.5 ms).  The dark penumbra around the balls is a result of de-rotating the balls to a common frame of reference.  That de-rotation (reversed later in the processing) creates some dark edges because some of the back of the ball (to the camera) has no information when rotated around to the front in 3D.  The small dots (especially on the third image) are a result of the 3D/2D projection/de-projection operations.

    In order to provide a visual sanity-check for the player, the system reproduces the first ball image rotated by the determined angular deltas, and shows that image as the third image above.  In a perfect world, the result of rotating the first image by the delta angles should produce...


  • GSPro Interface Working

    James Pilgrim, 04/03/2024 at 18:11



Discussions

jjerome80 wrote 5 days ago point

Very cool project.  I'm curious what the expectations are for this project?  Is this expected to compare with the $300-$500 launch monitors available or would this have hardware to compare to $1000+ monitors on the market? 


ruezy wrote 04/25/2024 at 19:28 point

Great stuff and I wanted to comment some relevant info if anyone tries to go down the road of using AI models to help with the prediction. I assume this would lead to being able to make the predictions more accurate with less costly hardware, if some smart people were up to the task.

I found some relevant info from a video where someone uses similar tools to track players on a football field which brought up some interesting resources.
https://www.youtube.com/watch?v=neBZ6huolkg

A dataset of 1400 golf swing videos.
https://www.kaggle.com/datasets/marcmarais/videos-160/data

He uses the YOLOv8 deep-learning object detection model, which seems to be something OpenCV handles already, and it seems if anything YOLO runs faster and can be fine-tuned.
https://github.com/ultralytics/ultralytics


James Pilgrim wrote 04/27/2024 at 16:41 point

How cool - thank you!  I've been thinking about this all morning now, and there's a number of great projects that I could imagine trying.  I haven't learned Yolo yet, but I can't wait to start digging in.  Of course, I'd better finish the Launch Monitor first... :/


robhedrick wrote 04/21/2024 at 18:24 point

How's this coming along? I would love to see some code! 


andrew wrote 04/06/2024 at 04:56 point

This is looking really good! Do you have a GitHub repo yet?  I have been looking for a backyard setup. 


James Pilgrim wrote 04/06/2024 at 13:32 point

Not quite ready for a public repo yet, but hopefully someday not too far in the future.  Outdoor setups may be better served with a radar-based LM.  The DIY project is currently very sensitive to large amounts of IR light.


James Pilgrim wrote 04/02/2024 at 16:40 point

Hi all!  Apologies for being pokey about responding to everyone's DMs.  I've ALMOST got GSPro integration working, and will hopefully have a little demo video out soon showing some actual shots on (simulated) greens. 

I hope to provide some thoughtful responses to folks as soon as I get the GSPro connection working consistently.  (And thanks to the great GSPro people- they've been very helpful).


Eric wrote 03/27/2024 at 12:54 point

Project looks great. In the latest photo it looks like you're hitting with an actual golf club. Just curious, are you still planning on open sourcing this?


James Pilgrim wrote 03/28/2024 at 13:39 point

Yes - at this point, all the testing is with real clubs and golf balls.  Although I'm still searching for enough room to try out a big driver.  That will be necessary to prove out the high-speed capability of the LM, which is only theoretical at this point. ;/  I may have to borrow a friend with some real golfing skill as well, given that I'm not sure I can hit a ball anywhere near 100m/s.

The intent is still to open source the LM.  That said, there have been a couple of recent developments that might affect that, though it seems unlikely to change.


camab wrote 03/24/2024 at 16:42 point

This is really cool. Something I've been wanting to do for a while now but never attempted. How are you handling the image capture timing? I think I've read that Skytrak uses a laser curtain to tell when the golf ball has been hit. Are you just having the first camera detect a golf ball via machine learning and then wait for it to move to trigger the second camera?


James Pilgrim wrote 03/26/2024 at 03:01 point

Yes - The first camera just watches in a tight loop for any ball movement.  Once it moves, the second camera is triggered.


James Pilgrim wrote 03/23/2024 at 15:25 point

Ha - Thank you - I can't wait to start using it with a real golf simulator setup myself! :)  Still lots of testing to do, but am edging closer to using it in a simulator bay with a driver and a big screen every day.


Dylan Walsh wrote 03/11/2024 at 23:27 point

This is so freaking cool. Can’t wait to try this out.

