To recap the problem: the RTC in my PyBoard Lite runs ~2.4% slower than it should, based on comparisons with my PC clock. While that may not sound like much, it produces, for example, a 23-minute error over a period of about 15 hours. Given that I plan to deploy the current meter for a month, it's clear there has to be some way to compensate.
The RTC.calibrate() method can't compensate for that much drift. The most it can do is slow down or speed up the RTC by ~0.0488%.
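To see why calibration alone won't cut it, here is a quick back-of-the-envelope check. It assumes the STM32-style smooth calibration mechanism, which (as I understand it) can insert or mask at most 512 clock pulses per 2^20-pulse window; the numbers below are just the figures from the text.

```python
# Rough check of why RTC.calibrate() can't fix a ~2.4% drift.
# Assumption: STM32 smooth calibration adjusts by at most 512 pulses
# per 2**20-pulse window, which caps the correction range.
max_correction = 512 / 2**20             # ~0.000488, i.e. ~0.0488%
observed_drift = 23 * 60 / (15 * 3600)   # 23 min lost over ~15 h

print(f"max calibrate range: {max_correction:.4%}")
print(f"observed drift:      {observed_drift:.4%}")
print(f"shortfall factor:    {observed_drift / max_correction:.0f}x")
```

The drift is roughly fifty times larger than what calibration can absorb, so a post-processing correction is needed anyway.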
I assume that this RTC lag is consistent in its speed, so I can do the following:
- Before logging starts, set the PyBoard RTC to match my PC clock and capture that moment as t_0
- Deploy meter and let it do the logging
- Dismount the logger to download the data
- Before downloading the logs, and while the PyBoard is still running, connect to it and capture the PyBoard RTC value and my PC clock value as t_actual and t_expected respectively
- From the difference between the expected and actual timestamps, and the duration since logging started, calculate a "speed-up coefficient"
- Apply that coefficient to the timestamps in my logs before processing them further
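The procedure above can be sketched in a few lines. The variable names follow the text (t_0, t_actual, t_expected); the example timestamps are made-up epoch seconds, and `correct` is a hypothetical helper, not something from the PyBoard API.

```python
# Sketch of the compensation procedure; all timestamps are epoch
# seconds, and the example values are invented for illustration.
t_0 = 1_700_000_000          # moment the RTC was synced to the PC clock
t_actual = t_0 + 877_000     # what the slow PyBoard RTC reports later
t_expected = t_0 + 900_000   # what the PC clock reports at that moment

# Speed-up coefficient: real seconds elapsed per RTC second.
k = (t_expected - t_0) / (t_actual - t_0)

def correct(t_logged):
    """Map a logged RTC timestamp onto the PC (true) timeline."""
    return t_0 + k * (t_logged - t_0)

# A log entry stamped halfway through the RTC's run maps to halfway
# through the true interval:
print(correct(t_0 + 438_500))  # ~ t_0 + 450_000
```

With the invented values above, k comes out at about 1.026, i.e. every RTC second corresponds to roughly 1.026 real seconds.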
The math is pretty simple here. Let's assume that our RTC speed is some fraction of the real (well-behaved clock's) speed. Then, if we sync them at moment t_0, we get this: