The holiday weekend was full of moments of success and frustration. I refined the program to produce accurate numbers and began testing how consistent the new timer was, only to find it wasn't consistent at all. The interrupt used to count how many times the overflow has occurred caused zero problems; pulling a consistent number from the counter itself, however, proved troublesome.
It's obvious in hindsight, but when you're trying to count individual ticks of the CPU clock, any other interrupts that fire can really screw with your results. I was typically able to pull reasonable numbers at the microsecond scale, with slightly better resolution than micros(), but the variance at the nanosecond level was completely unacceptable. It was bad enough that the micros() function actually produced a marginally lower standard deviation... ouch.
Ultimately, disabling global interrupts produced dead-steady numbers from the Timer 1 counter. But working without interrupts also means the overflow interrupt is no longer enabled, so I would have to "manually" watch the counter for overflows. Sounded like a big waste of resources to me.
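For what it's worth, the "manual" overflow watch isn't much code. A rough sketch of what it might look like on an ATmega328-class part (register and bit names taken from the AVR datasheet; `done_condition()` is a placeholder, and this fragment is illustrative, not tested on hardware):

```c
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

uint32_t overflows = 0;

void timed_section(void)
{
    cli();                       /* global interrupts off for steady counts */
    TCNT1 = 0;
    overflows = 0;
    while (!done_condition()) {  /* done_condition(): placeholder for the event */
        /* TOV1 is set by hardware on overflow even when the overflow
         * interrupt (TOIE1) is disabled; writing a 1 clears it. */
        if (TIFR1 & _BV(TOV1)) {
            TIFR1 = _BV(TOV1);
            overflows++;
        }
    }
    sei();
}
```

The polling loop does burn cycles, which is exactly the resource waste mentioned above.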
So ultimately I have two options to sort through now:
Option A: Pinpoint the offending interrupt(s) and temporarily disable only those, while leaving global interrupts enabled.
Option B: Use a combination of Timer 1 and Timer 2 to count nanoseconds without any interrupts. It dawned on me late last night that with a prescaler of 256, Timer 1 can count up to a full second before overflowing, at a resolution of 16 us. Timer 2 at a prescaler of 1 can then count from 62.5 ns up to 16 us before it overflows. Between the two of them I can get accurate timing up to a full second at a resolution of 62.5 ns, all without interrupts.
I'd still be disabling global interrupts, but given the communication requirements of the Nordic transmitters, I don't believe that would pose too large a hurdle.
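If Option A wins instead, disabling a single suspect interrupt is only a register write or two. For example, if the culprit turned out to be Timer 0's overflow interrupt (the one the Arduino core uses to drive millis()), a sketch might look like this (ATmega328-class register names assumed; illustrative only, and note millis() will lose ticks while masked):

```c
#include <avr/io.h>
#include <stdint.h>

static uint8_t saved_timsk0;

void begin_timing(void)
{
    saved_timsk0 = TIMSK0;
    TIMSK0 &= ~_BV(TOIE0);   /* mask only Timer 0 overflow; global interrupts stay on */
}

void end_timing(void)
{
    TIMSK0 = saved_timsk0;   /* restore -- millis()/micros() resume, slightly behind */
}
```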
Edit: The Trinket Pro and transceivers are now ordered. Hopefully next week I'll be able to get some hardware brought together. I've also attached screenshots illustrating the effect interrupts have. Both are using the Option B timer scheme, which still shows some odd variance in the low-resolution counter.
Edit 12-2-2014: After reviewing the datasheet for the Atmega32u, it clearly points out that it may be necessary to sync the prescaler with the code... that would explain the intermittent timing error I was seeing in the low-resolution counter. Using the GTCCR register, I can reset the prescaler at the start of my timing event to sync it up. Otherwise the prescaler may tick right at the start of the timing event, or a while later, leading to the inconsistent behavior.
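A hedged sketch of what that prescaler sync might look like, based on the GTCCR bits in the AVR datasheets (TSM holds the prescaler reset asserted while the counters are zeroed, then releasing it starts everything in lockstep; the exact set of timers sharing the prescaler varies by part, and this fragment hasn't been run on hardware):

```c
#include <avr/io.h>

void start_timing_event(void)
{
    GTCCR = _BV(TSM) | _BV(PSRSYNC); /* halt and reset the shared prescaler */
    TCNT1 = 0;                       /* zero the counters while halted      */
    TCNT2 = 0;
    GTCCR = 0;                       /* release: timers restart in sync     */
}
```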
The only downside is that syncing the prescaler affects all three timers tied to it, so it would introduce errant behavior into the millis()/micros() functions; but as long as I'm aware of that when writing the program, it shouldn't be a problem.