Is a 1D Kalman filter suitable to correct clock drift against an external, noisy, cyclical phenomenon?

Simon Merrett wrote 02/20/2020 at 20:21

Bear with. 

In typical fashion, I've been wanting to do a project for a while, HaD runs a competition for it, and I miss the boat. But that's fine. I ran into an issue last year in a design for a remote, low-power sensor device which needed to know roughly what time of day it was. However, setting the time on an RTC would have been beyond the UI, the user and the deployment regime, so I had to omit the RTC.

Since then, I've wondered whether you could use periodic light readings from an LDR/fixed-resistor voltage divider into an ADC to identify a daily period, keep the notoriously drifty watchdog oscillator on track, and maintain an internal sense of time in the microcontroller that, over months, would only be out by roughly its PPM error * 1 day (plus a bit more for error in matching the periodic light readings).
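To put rough numbers on that: a watchdog oscillator running, say, 1% (10,000 PPM) fast gains 0.01 * 86,400 s ≈ 14.4 minutes over a day. Uncorrected, that compounds week after week; resynchronised at each detected day boundary, the accumulated error should stay bounded at roughly that one-day figure.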

So far I've found a reasonable algorithm to identify the daily light-reading periods, which appears to work with a full year of UK solar PV data that I found (mapped to fixed-point values) as a proxy for light readings. I'm currently logging some real readings with the hardware described above, to test it on more representative data.

What I'd like to ask the stack is whether you think a one-dimensional Kalman filter would be a good way to combine the expected/estimated duration of a day (based on the number of watchdog interrupt firings in the previous light period) with the measured duration of the day (based on the number of watchdog interrupt firings in the current light period).
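To make the question concrete, here's a minimal sketch of the scalar filter I have in mind, in C. The names and the Q/R tuning values are placeholders, not working firmware; the state is simply "watchdog ticks per day":

```c
#include <stdint.h>

/* 1D Kalman filter over "watchdog ticks per day".
 * x: estimated ticks per day, P: variance of that estimate,
 * Q: process noise (day-to-day oscillator drift),
 * R: measurement noise (jitter in the light-based day detection).
 * All names and values here are illustrative. */
typedef struct {
    float x;
    float P;
    float Q;
    float R;
} kf1d_t;

static void kf1d_init(kf1d_t *kf, float x0, float P0, float Q, float R)
{
    kf->x = x0;
    kf->P = P0;
    kf->Q = Q;
    kf->R = R;
}

/* Call once per detected day boundary with the tick count measured
 * over the day that just ended; returns the refined estimate. */
static float kf1d_update(kf1d_t *kf, float measured_ticks)
{
    /* Predict: the "true" day length is modelled as constant, so the
     * state carries over and only the uncertainty grows. */
    kf->P += kf->Q;

    /* Update: blend prediction and measurement via the Kalman gain. */
    float K = kf->P / (kf->P + kf->R);
    kf->x += K * (measured_ticks - kf->x);
    kf->P *= (1.0f - K);

    return kf->x;
}
```

The idea would be to seed x0 with the nominal ticks-per-day from the oscillator datasheet, give P0 a generous variance, and make R large relative to Q so that a single badly detected trough can't yank the estimate far, but I'd welcome thoughts on whether this is the right tool at all.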

Planning to put this project on HaD.io soon, once I have a little more to show for it. 

PS: the current day-period measuring algorithm is _very_ lightweight. It normalises 6 hr and 12 hr averages by dividing the minimum by the maximum, then looks for a trough and takes the middle of the trough as the marker for the end of one day and the beginning of the next (see the sketch below). For a heavier processing burden, I may look into an autocorrelation routine, which should be more robust to some failure modes (such as the ratio not dipping below the trough threshold for some reason).
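For reference, the trough detection is roughly the following shape (a sketch only; the threshold value and the Q15 fixed-point scaling are placeholders, not what the real firmware uses):

```c
#include <stdint.h>
#include <stdbool.h>

/* Normalise the two running averages by dividing the smaller by the
 * larger, giving a Q15 ratio in [0, 1]. Assumes the averages fit in
 * 17 bits (e.g. averages of 12-bit ADC samples). */
static uint16_t norm_ratio_q15(uint32_t avg_6h, uint32_t avg_12h)
{
    uint32_t lo = (avg_6h < avg_12h) ? avg_6h : avg_12h;
    uint32_t hi = (avg_6h < avg_12h) ? avg_12h : avg_6h;
    return (hi == 0) ? 0 : (uint16_t)((lo << 15) / hi);
}

#define TROUGH_THRESH_Q15  6554u  /* ~0.20, placeholder threshold */

typedef struct {
    uint32_t tick;          /* watchdog ticks since start     */
    uint32_t trough_start;  /* tick at which the ratio dipped */
    bool     in_trough;
} trough_det_t;

/* Feed one normalised ratio per watchdog tick. Returns true exactly
 * once per trough, writing the day-boundary tick (the middle of the
 * trough) into *boundary. */
static bool trough_step(trough_det_t *d, uint16_t ratio_q15,
                        uint32_t *boundary)
{
    d->tick++;
    if (!d->in_trough && ratio_q15 < TROUGH_THRESH_Q15) {
        d->in_trough = true;
        d->trough_start = d->tick;
    } else if (d->in_trough && ratio_q15 >= TROUGH_THRESH_Q15) {
        d->in_trough = false;
        *boundary = d->trough_start + (d->tick - d->trough_start) / 2u;
        return true;
    }
    return false;
}
```

The measured day duration fed to the Kalman filter above would then just be the difference between consecutive boundary ticks.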

I could not find anything like this on the web when I looked, so if you have seen something similar before, please link to it. I would have thought this would be a helpful way to keep coarse time on microcontroller sensor nodes for very little cost in components and processor resources.