Is there a memory-efficient way to compress/average/combine time series data in a non-linear way?

Simon Merrett wrote 04/15/2020 at 15:51 0 points

A mouthful of a .stack title - sorry about that. 

I'm recording temperature and I'd like to show the most recent readings at reasonably high resolution, but also be able to see readings taken a long time ago at much lower resolution, for trend spotting etc. Perhaps a semi-log graph would suit this.

Does anyone have a good way to calculate the plot values while storing as few variables in an array as possible? The principle is that recent readings are shown with high time resolution and, as the age of a reading increases, the resolution along the time axis decreases (just like a log scale, though it doesn't have to be logarithmic).

For example, the first pixel would show the current second's reading (t0), the second pixel would show the mean value over the last 5 seconds (t1-t5), and the third pixel would show the mean reading over the previous 10 seconds (t6-t15). The problem I see is that once I calculate means for longer periods, scaling from, say, 1 second to 12 hours over 64 pixels/points means there isn't a neat way to shunt values into the next column over. So it looks like I need to store the underlying data at a higher resolution (and thus use more memory) than I would present it at.

There must be a time series compression approach that works for this. It could use bit twiddling, or an approach similar to an R-2R ladder DAC. Has anyone got any tips or links to similar problems and solution approaches?
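To make the "shunting" idea concrete, here is a rough sketch (in Python, names made up, not a finished solution) of the cascading scheme I'm imagining: each tier accumulates a fixed number of values and, when full, forwards their mean to the next, coarser tier. Each tier only needs a small display buffer plus one running sum, so memory stays bounded no matter how long the recording runs:

```python
from collections import deque

class TieredAverager:
    """Cascading decimation sketch: tier 0 holds raw samples; once
    `factors[i]` values have accumulated in tier i, their mean is
    pushed into tier i+1. Memory per tier: one fixed-length deque of
    display points plus one running sum and count."""

    def __init__(self, factors, points_per_tier):
        # factors[i] = how many tier-i values combine into one tier-(i+1) value
        self.factors = factors
        self.tiers = [deque(maxlen=points_per_tier)
                      for _ in range(len(factors) + 1)]
        self.sums = [0.0] * len(factors)
        self.counts = [0] * len(factors)

    def add(self, sample):
        self._push(0, sample)

    def _push(self, i, value):
        self.tiers[i].append(value)          # keep for display at this resolution
        if i < len(self.factors):
            self.sums[i] += value
            self.counts[i] += 1
            if self.counts[i] == self.factors[i]:
                mean = self.sums[i] / self.factors[i]
                self.sums[i] = 0.0
                self.counts[i] = 0
                self._push(i + 1, mean)      # shunt the mean to the coarser tier

# Example: 5 raw samples -> one tier-1 point, 2 tier-1 points -> one tier-2 point
ta = TieredAverager(factors=[5, 2], points_per_tier=4)
for v in range(1, 11):
    ta.add(float(v))
print(list(ta.tiers[0]))  # last 4 raw samples: [7.0, 8.0, 9.0, 10.0]
print(list(ta.tiers[1]))  # means of samples 1-5 and 6-10: [3.0, 8.0]
print(list(ta.tiers[2]))  # mean of those two means: [5.5]
```

This only gives fixed integer decimation ratios per tier, though, which is exactly where my question bites: a smooth log-like scaling across 64 pixels doesn't decompose into neat whole-number ratios like this.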

Thank you.