Techniques Review

We are going to consider only time series analysis in this section. There are reviews of "complexity measures" (Feldman and Crutchfield 1998; Daw et al. 2002). Kuusela et al. (2002) compare the results of ten different complexity measures on heart rate and blood pressure time series.

  • Information theory estimates of complexity
    In general the entropy of a discrete probability distribution is given by
    H = -sum_i( p_i * log(p_i) )
    where i indexes the discrete states and sum_i means to sum over all of them. The entropy is larger when each discrete state has about the same probability of occurrence. If you use a base-2 log, then the entropy represents the number of bits of information that can be sent by a signal consisting of one state. As an example, if you have 4 possible states of a signal, all with equal probability, then the entropy is H = -4*(1/4*log2(1/4)) = -4*(1/4*(-2)) = 2 bits. So sending one of the states (perhaps one of the letters A-D) would transmit 2 bits of information. Sleigh et al. compare five measures of entropy used to classify EEG.
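    As a minimal illustration (not drawn from any of the papers cited here), the following Python sketch computes the entropy of a discrete distribution directly from the state probabilities; the function name shannon_entropy and the default base-2 logarithm are choices made for this example.

      import numpy as np

      def shannon_entropy(probabilities, base=2.0):
          """Entropy H = -sum_i( p_i * log(p_i) ) of a discrete distribution.

          probabilities : sequence of state probabilities (should sum to 1).
          base          : 2 gives bits, numpy.e gives nats.
          """
          p = np.asarray(probabilities, dtype=float)
          p = p[p > 0]  # states with zero probability contribute nothing
          return float(-np.sum(p * np.log(p)) / np.log(base))

      # Four equally likely states give 2 bits, matching the example above.
      print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))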
    • Approximate Entropy
      Pincus (1991) introduced Approximate Entropy as a complexity measure. Given groups of N points in a series, the approximate entropy is related to the probability that two sequences which are similar for N points remain similar at the next point. He used this technique to characterize three chaotic systems. A sketch of the calculation appears after the examples below.
      • EEG
        Rezek and Roberts (1998) compare Fourier entropy (see below) and approximate entropy and conclude that both can distinguish anesthesia depth (as judged by agent concentration). Deeper anesthesia means lower complexity. Approximate entropy may have been slightly more sensitive to changes than Fourier entropy. Bhattacharya (2000) used approximate entropy to characterize the EEG of pathological groups compared with healthy groups. The degree of complexity was significantly reduced in the seizure group for most of the electrodes, whereas a distinct discrimination between the manic and healthy groups based on these nonlinear measures was not evident.
      • Respiration
        Burioka et al. (2003) used approximate entropy to measure the complexity of EEG and respiratory motion during different stages of sleep. They found that the entropy of the two signals was related and that both decreased during deep sleep.
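      The following Python sketch follows the general form of Pincus's definition (template length m, tolerance r, Chebyshev distance, self-matches included); the defaults m=2 and r=0.2 and the brute-force pairwise comparison are illustrative choices rather than details taken from the papers above.

        import numpy as np

        def approximate_entropy(u, m=2, r=0.2):
            """Approximate entropy ApEn(m, r) of a 1-D series u.

            m : template length; r : tolerance, often a fraction of std(u).
            Self-matches are included, as in Pincus's original definition.
            """
            u = np.asarray(u, dtype=float)

            def phi(length):
                n = len(u) - length + 1
                # All templates of the given length as rows of an (n, length) array.
                x = np.array([u[i:i + length] for i in range(n)])
                # Chebyshev distance between every pair of templates.
                d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
                c = np.sum(d <= r, axis=1) / n
                return np.mean(np.log(c))

            return phi(m) - phi(m + 1)

        # Example: a noisy sine wave, with r scaled by the signal's spread.
        rng = np.random.default_rng(0)
        sig = np.sin(np.linspace(0, 8 * np.pi, 300)) + 0.1 * rng.standard_normal(300)
        print(approximate_entropy(sig, m=2, r=0.2 * np.std(sig)))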
    • Sample entropy
      A modification of Approximate Entropy introduced by Richman and Moorman (2000). They claim that approximate entropy as defined by Pincus is biased because it includes self-matches. They also simplify the calculation somewhat. A MATLAB function to compute the sample entropy is available; its authors are DK Lake (dlake@virginia.edu), JR Moorman and Cao Hanqing. A sketch of the calculation follows the example below.
      • Neonatal heart
        Lake et al. (2002) use sample entropy to predict neonatal sepsis from heart rate data. They suggest ways to evaluate the optimal settings for the parameters (length of sequence, error tolerance) used in sample entropy. They found that sepsis could be predicted by the sample entropy, but that much of the difference was related to spike-like outliers in the data.
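      A rough Python sketch of the sample entropy calculation as Richman and Moorman describe it (self-matches excluded, the same set of starting points used for both template lengths); it is not the MATLAB function mentioned above, and the defaults m=2 and r=0.2 are illustrative.

        import numpy as np

        def sample_entropy(u, m=2, r=0.2):
            """Sample entropy SampEn(m, r) = -ln(A/B), self-matches excluded.

            B counts pairs of length-m templates within tolerance r (Chebyshev
            distance); A counts pairs of length m+1 from the same starting points.
            """
            u = np.asarray(u, dtype=float)
            n = len(u) - m  # number of template starting points

            def match_count(length):
                x = np.array([u[i:i + length] for i in range(n)])
                d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
                # Count each matching pair once; the n diagonal self-matches are removed.
                return (np.sum(d <= r) - n) / 2

            b = match_count(m)
            a = match_count(m + 1)
            return -np.log(a / b)

        # Example usage, with r set relative to the series' standard deviation.
        x = np.random.default_rng(1).standard_normal(400)
        print(sample_entropy(x, m=2, r=0.2 * np.std(x)))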
    • Fourier entropy
      To compute the Fourier entropy, a power spectral density (PSD) of a time series (or a portion thereof) is computed. The PSD is normalized to produce a probability-like distribution, and the entropy is calculated as shown above. Fourier entropy seems to be the baseline against which other methods are shown to be better, but it is often the easiest to compute when the data sets are large.
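      A short Python sketch, assuming a plain periodogram as the PSD estimate (other estimators, e.g. Welch's method, would serve equally well); the function name fourier_entropy is an arbitrary choice.

        import numpy as np

        def fourier_entropy(x, base=2.0):
            """Spectral (Fourier) entropy of a 1-D series.

            The PSD is estimated with a plain periodogram (|rFFT|^2), normalized
            to sum to one, and the entropy of that distribution is returned.
            """
            x = np.asarray(x, dtype=float)
            psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
            psd = psd / np.sum(psd)   # probability-like distribution over frequency bins
            psd = psd[psd > 0]
            return float(-np.sum(psd * np.log(psd)) / np.log(base))

        # A pure sine concentrates power in one bin (low entropy); white noise
        # spreads power across all bins (high entropy).
        t = np.linspace(0, 1, 1024, endpoint=False)
        print(fourier_entropy(np.sin(2 * np.pi * 50 * t)))
        print(fourier_entropy(np.random.default_rng(2).standard_normal(1024)))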
    • Wavelet entropy
      Rosso et al. (2003) use wavelet transforms (as well as other measures) to characterize seizure states from EEG recordings. Jones et al. (2002) used a 2D wavelet transform of 2D recordings of head surface potentials; their complexity measure was related to the number of wavelet components needed to fit the signal.
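      One common recipe, sketched below: treat the relative energy in each level of a discrete wavelet decomposition as a probability distribution and take its Shannon entropy. The use of the PyWavelets package, the 'db4' wavelet, and the 5-level decomposition are illustrative assumptions, not details taken from the cited papers.

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_entropy(x, wavelet="db4", level=5, base=2.0):
            """Entropy of the relative wavelet energy per decomposition level."""
            coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            p = energies / np.sum(energies)   # relative energy of each level
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)) / np.log(base))

        # A slow sine keeps its energy in the coarse levels (low entropy);
        # white noise spreads energy across all levels (higher entropy).
        t = np.linspace(0, 1, 1024, endpoint=False)
        print(wavelet_entropy(np.sin(2 * np.pi * 4 * t)))
        print(wavelet_entropy(np.random.default_rng(3).standard_normal(1024)))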
    • Renyi Entropy (Gonzalez et al. 2000)
      This scheme computes the...