A few people have wondered why I'm going through all this hassle when there are plenty of cheap ATMEGA- or PIC-based frequency counter projects around. The answer is simply that none of those solutions is very good when you're after ridiculously high precision, which is my primary goal, or they're unwieldy to use.
Running a signal directly into an MCU, even one with a high clock rate, generally involves using interrupts that fire constantly. At high frequencies, this leaves very few, if any, clock cycles for the processor to do anything else. Even counting a signal of a couple hundred kHz on an ATMEGA running at 16 MHz results in horribly sluggish performance for everything else. These solutions also rely on the MCU's base clock, which is generally not selected for high precision. Furthermore, unless you're programming in assembly, you don't know exactly how many clock cycles it takes to perform a calculation, even something as simple as reading a pin and incrementing an integer, so you have no idea how the code itself is affecting the accuracy of your measurement. This circuit eliminates everything to do with the processor (with the exception of silicon bugs in the calculations) as a source of error.
In addition to the above error sources, most of these designs use a divider (prescaler) circuit to lower the frequency of the incoming signal before counting it. If you're dividing the frequency by 4, each count now represents 4 input cycles, so for a given gate time your resolution is 4 times worse.