
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

The theory of Shannon entropy was applied to the Choi-Williams time-frequency distribution (CWD) of time series in order to extract entropy information in both the time and frequency domains. In this way, four novel indexes were defined: (1) partial instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function at each time instant taken independently; (2) partial spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of each frequency value taken independently; (3) complete instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function of the entire CWD; (4) complete spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of the entire CWD. These indexes were tested on synthetic time series with different behaviors (periodic, chaotic and random) and on a dataset of electroencephalographic (EEG) signals recorded in different states (eyes-open, eyes-closed, ictal and non-ictal activity). The results have shown that the values of these indexes tend to decrease, in different proportions, when the behavior of the synthetic signals evolves from chaos or randomness to periodicity. Statistical differences (

Since the works of Kotelnikov and Shannon [

The classical Shannon entropy measures the average information provided by a set of events and quantifies their uncertainty. This measure is a natural candidate for quantifying the complexity of a signal. The level of chaoticity may also be measured using entropy: higher entropy represents higher uncertainty and a more irregular behavior of the signal. Moreover, if noise is added to an ordered signal, the uncertainty increases and so does the entropy. Entropy can even explain how linked complex systems interact and exchange information. Quantifying the magnitude of this information becomes a goal in the study of biological signals.

Entropy estimation consists of calculating the probability of the events that occur in a time signal and obtaining a reliable average value of the information provided by each of these events. The evolution of the entropy of a signal over time, calculated from the instantaneous information within a window that slides over the signal, is a smoothed version of the sequence of instantaneous information, because the entropy is the average value of the information in this window.
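The smoothing effect described above can be illustrated with a short sketch (all names and parameters here are illustrative, not taken from the paper): the entropy of a sliding window over a quantized signal varies slowly compared with the sample-to-sample information, because averaging over the window acts as a low-pass filter.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy signal: a noisy sine, quantized into 8 amplitude levels
x = np.sin(2 * np.pi * 2 * np.arange(0, 4, 1 / 128)) + 0.3 * rng.standard_normal(512)
levels = np.digitize(x, np.linspace(x.min(), x.max(), 9)[1:-1])

def window_entropy(symbols, width):
    """Shannon entropy (bits) of the symbol histogram in each sliding window."""
    out = []
    for start in range(len(symbols) - width + 1):
        _, counts = np.unique(symbols[start:start + width], return_counts=True)
        p = counts / counts.sum()
        out.append(-(p * np.log2(p)).sum())
    return np.array(out)

# The windowed entropy sequence evolves much more slowly than the
# sample-to-sample information, illustrating the inherent smoothing.
h = window_entropy(levels, width=64)
```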

The purpose of this paper is both to avoid this low-pass filtering inherent in the calculation of the entropy and to obtain instantaneous values of this measure. The time-frequency representation (TFR) technique is suited to achieve both aims. TFR generalizes the concept of the time and frequency domains to a joint time-frequency function that indicates how the frequency content of a signal changes over time [

In this paper, we investigate new measures that quantify the complexity and information content of a signal. Selecting time and frequency functions that satisfy marginal properties, one can assume that the energy density of a signal in one instant (or instantaneous power of the signal) is given by the entropy associated with the frequency components of the signal at this time instant (or instantaneous entropy). By similar reasoning, an equivalent measure can be obtained in the frequency domain (spectral information entropy). Thus, a different way to calculate the information in the TFR by estimating a probability density function of a signal either in time or in frequency domain is proposed in this paper. In a similar study, Baraniuk [

The Choi-Williams distribution (CWD) [

These indexes are tested on synthetic time series that simulate signals in which different behaviors (periodic, chaotic and random) are combined and on a dataset of electroencephalographic (EEG) signals recorded in different states (eyes-open, eyes-closed, ictal and non-ictal activity). For this analysis, EEG signals are selected since they are generated by nonlinear deterministic processes with nonlinear coupling interactions between neuronal populations [

CWD (1) is obtained by convolving the Wigner distribution (WD) (2) with the Choi-Williams exponential kernel (3) [
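Equations (1)-(3) referenced here did not survive extraction; the standard formulations of the CWD, the WD and the Choi-Williams kernel (with the kernel parameter denoted σ, notation assumed) read:

```latex
% (1) CWD as a kernel-smoothed Wigner distribution,
% (2) the Wigner distribution itself,
% (3) the Choi-Williams exponential kernel in the (mu, tau) plane.
\begin{align}
\mathrm{CWD}_x(t,f) &= \iint \sqrt{\frac{\sigma}{4\pi\tau^{2}}}\,
  e^{-\sigma(\mu-t)^{2}/(4\tau^{2})}\,
  x\!\left(\mu+\tfrac{\tau}{2}\right) x^{*}\!\left(\mu-\tfrac{\tau}{2}\right)
  e^{-j2\pi f\tau}\, d\mu\, d\tau, \tag{1}\\
\mathrm{WD}_x(t,f) &= \int x\!\left(t+\tfrac{\tau}{2}\right)
  x^{*}\!\left(t-\tfrac{\tau}{2}\right) e^{-j2\pi f\tau}\, d\tau, \tag{2}\\
\phi(\mu,\tau) &= \sqrt{\frac{\sigma}{4\pi\tau^{2}}}\,
  e^{-\sigma\mu^{2}/(4\tau^{2})}. \tag{3}
\end{align}
```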

The spectral power is defined as:
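The defining equation is missing from this copy; given the marginal properties invoked later in the text, the usual definition (notation assumed) would be the time-marginal of the CWD, with the instantaneous power as its frequency-marginal counterpart:

```latex
% Spectral power as the time-marginal of the CWD; the frequency-marginal
% yields the instantaneous power (marginal properties of the distribution).
\begin{align}
|X(f)|^{2} = \int_{-\infty}^{\infty} \mathrm{CWD}_x(t,f)\, dt,
\qquad
|x(t)|^{2} = \int_{-\infty}^{\infty} \mathrm{CWD}_x(t,f)\, df.
\end{align}
```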

Choosing an adequate value of the kernel parameter σ, the CWD attenuates the cross-terms that affect the WD while preserving good resolution in both time and frequency.

Entropy can express the mean information that an event provides when it takes place, the uncertainty about the outcome of an event and the dispersion of the probabilities with which the events take place. Let x_1, x_2, x_3, …, x_n be the possible values of a discrete random variable X. Its Shannon entropy is defined as

H(X) = −Σ_{i=1…n} p(x_i) · log p(x_i),

where p(x_i) is the probability of the outcome x_i.

The probability mass function (PMF) was defined for each time instant t_k as t_PMF(f_i, t_k) = CWD(f_i, t_k) / Σ_i CWD(f_i, t_k), taking each time instant independently, and for each frequency value f_k as f_PMF(t_i, f_k) = CWD(t_i, f_k) / Σ_i CWD(t_i, f_k), taking each frequency value independently.

The two complete distributions, quantization-time and quantization-frequency, were obtained by normalizing the CWD by the total energy of the time-frequency plane, q_PMF(f_i, t_k) = CWD(f_i, t_k) / Σ_i Σ_k CWD(f_i, t_k), so that the PMF of the entire CWD is used instead of the PMF of each time instant or frequency value taken independently.

From this proposed methodology, new indexes were defined:

Partial instantaneous entropy:

Partial spectral information entropy:

Complete instantaneous entropy:

Complete spectral information entropy:
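As a sketch of how the four indexes could be computed from a discretized TFR (the function and variable names are assumptions of this sketch, and the treatment of the "complete" indexes follows the description given in the abstract, i.e., the same sums evaluated on the PMF of the entire CWD):

```python
import numpy as np

def _plogp(p):
    """Elementwise -p*log2(p), with 0*log(0) taken as 0."""
    out = np.zeros_like(p, dtype=float)
    nz = p > 0
    out[nz] = -p[nz] * np.log2(p[nz])
    return out

def tfr_entropies(tfr):
    """Four entropy indexes from a non-negative TFR matrix (frequency x time).

    pInstEntr  - entropy over frequency at each time instant (column PMFs)
    pSpInfEntr - entropy over time at each frequency value (row PMFs)
    cInstEntr  - same time-wise sums, on the PMF of the entire TFR
    cSpInfEntr - same frequency-wise sums, on the PMF of the entire TFR
    """
    tfr = np.abs(tfr)
    col_pmf = tfr / tfr.sum(axis=0, keepdims=True)   # PMF per time instant
    row_pmf = tfr / tfr.sum(axis=1, keepdims=True)   # PMF per frequency value
    full_pmf = tfr / tfr.sum()                       # PMF of the entire TFR
    p_inst = _plogp(col_pmf).sum(axis=0)   # one value per time instant
    p_spec = _plogp(row_pmf).sum(axis=1)   # one value per frequency
    c_inst = _plogp(full_pmf).sum(axis=0)  # complete, per time instant
    c_spec = _plogp(full_pmf).sum(axis=1)  # complete, per frequency
    return p_inst, p_spec, c_inst, c_spec
```

For a flat TFR the partial indexes reach their maxima (log2 of the number of frequency bins and time instants, respectively), while any concentration of energy lowers them.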

In order to study the performances of the proposed indexes, the following synthetic signals were considered:

A periodic signal, composed of sinusoids x_i of frequencies F ≤ F_s/2, where F_s is the sampling frequency.

A MIX process, used in previous studies [

The same MIX process of (2) using as

The same MIX process of (3) using as

All synthetic signals had a length of 200 s and a sampling frequency of 128 Hz. For each synthetic signal, mean (
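A minimal generator for MIX-type signals, assuming the MIX(p) definition of Pincus (each sample is a sinusoid sample with probability 1−p and a variance-matched uniform random sample with probability p); the exact deterministic components used for signals 2-4 are not recoverable from this copy of the text, so this is only a sketch:

```python
import numpy as np

def mix_process(p, n, rng=None):
    """MIX(p) process in the sense of Pincus (an assumption here):
    sample j is sqrt(2)*sin(2*pi*j/12) with probability 1-p, and a
    uniform value on [-sqrt(3), sqrt(3)] (unit variance) with probability p."""
    rng = np.random.default_rng(rng)
    j = np.arange(n)
    periodic = np.sqrt(2) * np.sin(2 * np.pi * j / 12)
    noise = rng.uniform(-np.sqrt(3), np.sqrt(3), n)
    z = rng.random(n) < p              # Bernoulli(p) switch per sample
    return np.where(z, noise, periodic)

# p = 0 gives a purely periodic series; p = 1 gives pure uniform noise.
x = mix_process(0.4, 200 * 128)        # 200 s at 128 Hz, as in the text
```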

A freely available EEG dataset was used for validation [

Partial and complete quantization distributions

Partial and complete quantization distributions

Similarly to the evolutions observed in

Quantization-time distributions of signal 4 are shown in

Comparing the

The fundamental assumption of nonlinear techniques is that the EEG signal is generated by nonlinear deterministic processes with nonlinear coupling interactions between neuronal populations. Nonlinearity in the brain is introduced even at the neuronal level [

As can be seen in

Regarding the sets C, D and E of the EEG database, there are very significant differences between the values of

Certain differences are observed between the inter-ictal activity recorded from the epileptogenic zone (set C) and from the opposite brain hemisphere (set D) in the frequency domain (

In contrast to the traditional time domain information measures such as the

It is well known that the eyes-closed condition produces certain changes in the EEG. One of the most remarkable alterations is the rise in the power of the alpha rhythm [

As can be seen in

Higher values of

A new approach to calculating TFR entropy has been presented and applied to simulated and real physiological time series. This approach is based on the definition of Shannon entropy applied to the probability mass function of the TFR in both the time and frequency domains. In this way, the smoothing inherent in the calculation of the entropy is avoided and instantaneous values of this measure are obtained.

This methodology combines the property of the TFR that permits dealing with non-stationary signals with the ability of Shannon entropy to capture chaoticity, complexity and randomness.

The results have shown that the values of the proposed measures tend to decrease, in different proportions, when the behavior of the synthetic signals evolves from chaos or randomness to periodicity. Finally, this paper has demonstrated that these measures can be useful tools for quantifying the periodic, chaotic and random components of EEG signals.

This work was supported within the framework of the CICYT grant TEC2010-20886 from the Spanish Government and the Research Fellowship Grant FPU AP2009-0858 from the Spanish Government. CIBER of Bioengineering, Biomaterials and Nanomedicine is an initiative of ISCIII.

All authors collaborated and contributed extensively to the work presented in this paper. More specifically: Umberto Melia, Francesc Claria and Montserrat Vallverdu designed the methodology and wrote the paper; Umberto Melia carried out the development of the algorithms and the application of the entropy measures to real and simulated signals; Pere Caminal had the general overview of the work.

The authors have declared no conflicts of interest.

Signal 1: (

Signal 2: (

Signal 3: (

Signal 4: (

Set A (awake state with eyes open): (

Set B (awake state with eyes closed): (

Set C (non-ictal activity recorded from the epileptogenic zone): (

Set D (non-ictal activity recorded from the brain hemisphere opposite to set C):

Set E (ictal activity): (

EEG signals: (

EEG signals: (

EEG signals: (

EEG signals:

| | pInstEntr | pSpInfEntr | cInstEntr | cSpInfEntr |
|---|---|---|---|---|
| A vs. B | 0.2360^{n.s.} | 0.1547^{n.s.} | 0.0059 | 0.00003 |
| C vs. D | 0.9861^{n.s.} | 0.0147 | 0.1096^{n.s.} | 0.0399 |
| C vs. E | <0.0005 | <0.0005 | <0.0005 | <0.0005 |
| D vs. E | <0.0005 | <0.0005 | <0.0005 | <0.0005 |

n.s., not statistically significant.

EEG signals:

| | alpha band | beta band | delta band | alpha band | beta band | beta band |
|---|---|---|---|---|---|---|
| A vs. B | <0.0005 | <0.0005 | <0.0005 | <0.0005 | <0.0005 | <0.0005 |
| C vs. D | 0.0706^{n.s.} | 0.0467 | 0.2328^{n.s.} | 0.9013^{n.s.} | 0.8214^{n.s.} | 0.0235 |
| C vs. E | 0.9151^{n.s.} | <0.0005 | <0.0005 | <0.0005 | <0.0005 | 0.9249^{n.s.} |
| D vs. E | 0.1284^{n.s.} | <0.0005 | <0.0005 | <0.0005 | <0.0005 | 0.0341 |

n.s., not statistically significant.

EEG signals: sensitivity (

| Comparison | Index | Sensitivity (%) | Specificity (%) | AUC |
|---|---|---|---|---|
| A vs. B | pInstEntr (alpha) | 69.0 | 90.3 | 0.896 |
| | cInstEntr (delta) | 67.6 | 90.3 | 0.905 |
| | cSpInfEntr (beta) | 66.2 | 72.2 | 0.784 |
| C vs. E | cInstEntr | 76.8 | 82.8 | 0.881 |
| | cInstEntr (alpha) | 77.8 | 82.8 | 0.874 |
| | cInstEntr (beta) | 73.7 | 63.6 | 0.758 |
| D vs. E | cInstEntr | 83.8 | 75.8 | 0.893 |
| | pInstEntr (beta) | 61.6 | 96.0 | 0.853 |