ENTROPY IN DICHOTIC LISTENING EEG RECORDINGS

ABSTRACT
The dichotic listening (DL) paradigm has an important role in brain asymmetry studies at the behavioral level. In dichotic listening, the subjects are presented with diotic or dichotic stimuli consisting of meaningless consonant-vowel syllables in both ears. The subjects then report the syllable they heard through a six-button keypad. During this procedure, the EEG signals of the subjects were recorded with a 64-channel cap.


INTRODUCTION
The electroencephalogram (EEG) reflects the electrical activity of the brain as recorded by placing several electrodes on the scalp. The EEG is widely used for diagnostic evaluation of various brain disorders, such as determining the type and location of the activity observed during an epileptic seizure, or for studying sleep disorders [1].
The dichotic listening (DL) paradigm is often used to assess brain asymmetries at the behavioral level. Dichotic listening means presenting two auditory stimuli simultaneously, one in each ear, and the standard experiment requires that the subject report which of the two stimuli was perceived best [2].
Entropy is a thermodynamic quantity describing the amount of disorder in a system. It can be viewed as a measure of uncertainty regarding the information content of a system, which is often obtained from the probability distribution p = {p_i}, where p_i is the probability of finding the system in the i-th microstate [3]. In other words, entropy is a measure of complexity or disorder in a signal, or of the uncertainty of information in a statistical description of a system. The Shannon entropy gives a useful criterion for analyzing and comparing probability distributions; it provides a measure of the information content of any distribution.
In this study, EEG data recorded during the dichotic listening paradigm from different subjects are analyzed in terms of entropy changes. Different entropy computations are compared and the results are presented.

ELECTROENCEPHALOGRAM (EEG)
The human brain is the most complex organic matter known to mankind and has, not surprisingly, been the subject of extensive research. An early discovery established that the brain is associated with the generation of electrical activity. Richard Caton demonstrated as early as 1875 that electrical signals in the microvolt range can be recorded on the cerebral cortex of rabbits and dogs. Several years later, Hans Berger recorded electrical "brain waves" for the first time by attaching electrodes to the human scalp; these waves displayed a time-varying, oscillating behaviour that differed in shape from location to location on the scalp. The experiments conducted by Berger became the foundation of electroencephalography, which later became an important non-invasive clinical tool for better understanding the human brain and for diagnosing various functional brain disturbances [1].
Signals recorded from the scalp have, in general, amplitudes ranging from a few microvolts to approximately 100 µV and a frequency content ranging from 0.5 to 30-40 Hz. Electroencephalographic rhythms are conventionally classified into five frequency bands: delta (0-4 Hz), theta (4-7 Hz), alpha (8-13 Hz), beta (14-30 Hz) and gamma (30+ Hz). These rhythms generally arise according to the state of the brain, such as excited, relaxed, asleep or deeply asleep. The clinical EEG is commonly recorded using the International 10/20 system, a standardized system for electrode placement that employs 21 electrodes attached to the surface of the scalp at locations defined by certain anatomical reference points; the numbers 10 and 20 are percentages signifying relative distances between different electrode locations on the skull perimeter [1] (see Figure 1).
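The band boundaries above can be expressed as a tiny helper function; this is an illustrative sketch (the function name is our own, and the text leaves small gaps at 7-8 Hz and 13-14 Hz, which are assigned to the lower band here):

```python
def eeg_band(freq_hz):
    """Map a frequency in Hz to its conventional clinical EEG band.

    Cut-offs follow the text (delta 0-4, theta 4-7, alpha 8-13,
    beta 14-30, gamma 30+); the 7-8 and 13-14 Hz gaps fall into
    theta and alpha respectively in this sketch.
    """
    if freq_hz < 4:
        return "delta"
    if freq_hz < 8:
        return "theta"
    if freq_hz < 14:
        return "alpha"
    if freq_hz <= 30:
        return "beta"
    return "gamma"
```

Such a mapping is useful, for example, when labeling the subbands produced by a wavelet decomposition of the EEG.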

DICHOTIC LISTENING
Dichotic listening has been used in hundreds of research and clinical reports related to language processing, emotional arousal, hypnosis and altered states of consciousness, stroke patients, psychiatric disorders and child disorders, including dyslexia and congenital hemiplegia. It is one of the most frequently used methods to study language asymmetry. Because of its ability to distinguish which hemisphere processes specific sounds, dichotic listening has become widespread in studies of brain asymmetry [2], [4].
Dichotic listening means presenting two auditory stimuli simultaneously, one in each ear. The subject reports which of the two stimuli was perceived best. The test follows a typical sequence of events, in which a dichotic or diotic stimulus is presented, followed by the subject reporting what they heard, usually out of a list of six syllables or two tones. In the dichotic listening test of auditory laterality, consonant-vowel syllables such as ba, da, ga, pa, ta and ka are used.

ENTROPY
Entropy is a thermodynamic quantity describing the amount of disorder in a system. In information theory, entropy is a measure of the uncertainty associated with a random variable. It can be viewed as a measure of uncertainty regarding the information content of a system, which is often obtained from the probability distribution p = {p_i}, where p_i is the probability of finding the system in the i-th microstate:

H(p) = -Σ_i p_i log(p_i)
The entropy is measured in bits for logarithms of base 2, nats for base e, or hartleys (dits) for base 10 [3]. In the time domain, Shannon entropy is a measure of signal or system uncertainty and gives a useful criterion for analyzing and comparing probability distributions; it provides a measure of the information content of any distribution. When computed over the spectrum, Shannon entropy can be taken as a measure of signal or system complexity.
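The definition above translates directly into code; a minimal Python sketch (function name and defaults are illustrative):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy of a probability distribution p = {p_i}.

    base=2 gives bits, base=math.e gives nats, base=10 gives hartleys.
    Zero-probability terms are skipped (0 * log 0 is taken as 0).
    """
    return sum(-p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over four outcomes carries 2 bits:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A certain outcome carries no information:
print(shannon_entropy([1.0, 0.0, 0.0]))           # 0.0
```

All of the entropy methods discussed below reduce to this computation; they differ only in how the probabilities p_i are derived from the EEG samples.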
Entropy can also be computed using the relative energy values of the wavelet decomposition of an EEG signal. Wavelet analysis is used to decompose the EEG into the standard clinical subbands, and entropy is then computed from the wavelet coefficients. Wavelet entropy helps to segment periods of bursting in EEG signals [5].
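A minimal sketch of this wavelet-entropy idea, assuming a Haar wavelet and a 3-level decomposition (the text specifies neither; all names are illustrative):

```python
import math

def haar_dwt(signal):
    """One level of the Haar wavelet transform; returns
    (approximation, detail) coefficient lists. An odd trailing
    sample, if any, is dropped for simplicity."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def wavelet_entropy(signal, levels=3):
    """Shannon entropy of the relative wavelet energies.

    The signal is decomposed `levels` times; the energy of each
    detail subband (plus the final approximation) is normalized by
    the total energy, and these relative energies serve as the
    probabilities in the Shannon formula.
    """
    subbands = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        subbands.append(detail)
    subbands.append(approx)
    energies = [sum(c * c for c in band) for band in subbands]
    total = sum(energies) or 1.0
    rel = [e / total for e in energies]
    return sum(-p * math.log(p, 2) for p in rel if p > 0)
```

A signal whose energy is concentrated in a single subband yields zero wavelet entropy, while energy spread across all subbands yields the maximum, log2(levels + 1).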

APPLICATION
The data used in this study were obtained in an experiment conducted at the DEU Department of Brain Biophysics laboratories. The data contain the unfiltered EEG recordings of a subject captured with a 64-electrode cap.
In the experiment, the subject is given a dichotic stimulus at a pseudo-random time. 2170 ms after the stimulus, a light indicator is lit to inform the subject to answer about what was heard. The answer keypad contains six buttons, each assigned to one of the consonant-vowel syllables ba, da, ga, ka, pa and ta. The subject presses the related button and, after another pseudo-random interval, the second stimulus is delivered. 36 different pairs of stimuli are applied twice to the subject [4].
During this procedure, EEG recordings are received from 64 electrodes on the subject. Continuous EEG activity was acquired with a sampling rate of 1 kHz and filtered between 0.15 and 70 Hz. The signals are grouped into three sections according to the responding ear: Left Ear Advantage (LEA), Right Ear Advantage (REA) and Homonym (HOM). Each group is averaged and examined in the [-1500, 1500] ms time interval. Figure 2 shows an example of an average EEG signal on electrode CZ. In the study, the 1500 milliseconds before and the 1500 milliseconds after the stimulus are taken into account; as seen in Figure 2, the stimulus time is t = 0. The window size is chosen as 100 milliseconds for all computations. The Shannon entropy value of each window is computed, and the window is slid with a step of 1 millisecond. EEG and entropy values are normalized by z-score normalization in order to present them in the same graph as the real EEG signal [6].
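The windowing and normalization steps described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the per-window entropy function here is a coarse 10-bin histogram stand-in (any of the entropy methods below could be plugged in), and all names are our own. At the 1 kHz sampling rate, one sample corresponds to 1 ms, so a 100-sample window slid by 1 sample matches the 100 ms window and 1 ms step in the text.

```python
import math
from collections import Counter

def zscore(values):
    """z-score normalization: subtract the mean, divide by the std. dev."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n) or 1.0
    return [(v - mean) / std for v in values]

def window_entropy(vals):
    """Stand-in per-window entropy: relative frequencies over a
    coarse 10-bin amplitude histogram serve as probabilities."""
    lo, hi = min(vals), max(vals)
    width = (hi - lo) or 1.0
    counts = Counter(min(int((v - lo) / width * 10), 9) for v in vals)
    n = len(vals)
    return sum(-(c / n) * math.log(c / n, 2) for c in counts.values())

def sliding_entropy(signal, window=100, step=1, entropy_fn=window_entropy):
    """Entropy of each 100-sample window, slid `step` samples at a time."""
    return [entropy_fn(signal[i:i + window])
            for i in range(0, len(signal) - window + 1, step)]
```

Both the averaged EEG and the resulting entropy curve would then be passed through zscore so that they can be drawn on the same axes.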
In the entropy computation, three window-based methods are used, together with a fourth, wavelet-based method. In the first method (Method #1), the squares of the sample values are summed over each window, and each squared value divided by this total is used as the probability of the current measure [7]. The resulting graphs are presented in Figures 3, 4 and 5.
In the second method (Method #2), a modified computation scheme that takes the signal spectrum characteristics into account is used: the probability is approximated by the difference between each spectrum component and the mean value. Figures 6, 7 and 8 present the entropy values calculated by this method [7].
In the third method (Method #3), the frequencies of the EEG values are calculated. The microvolt values are rounded to the nearest hundredth (10^-2), and the frequency of each value divided by the signal length is used as its probability. The results are shown in Figures 9, 10 and 11. The fourth method (Method #4), wavelet entropy, is described together with Figure 12.
Comparing the four methods, Methods 1 and 2 are poorer than the others at locating large oscillations in the signals. Methods 2 and 3 are quite powerful in explaining the signal in terms of shape and amplitude; these two methods are effective in describing the deviations from the mean value of the signal. Method 3 uses a discretization of the continuous signal into small pieces, and the entropy values found by this method fit the original signal better. Methods 1, 2 and 3 are more sensitive to sudden amplitude changes in voltage, whereas Method 4 computes smoother entropy presentations of the signal and is less affected by point amplitude values. Especially with Methods 2 and 3, the original EEG signal can be reconstructed in shape because of the similarity of the entropy curve to the signal.
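Since the paper gives no source code, the following is a hedged Python reconstruction of Methods #1 and #3 from the descriptions above (Method #2's spectrum-based scheme is omitted because the text underdetermines it); function names and the choice of a base-2 logarithm are our assumptions:

```python
import math
from collections import Counter

def entropy_method1(window_vals):
    """Method #1 (as described): each squared sample, divided by the
    window's total sum of squares, serves as the probability of that
    measure; Shannon entropy is then taken over these probabilities."""
    sq = [v * v for v in window_vals]
    total = sum(sq) or 1.0
    return sum(-(s / total) * math.log(s / total, 2) for s in sq if s > 0)

def entropy_method3(window_vals):
    """Method #3 (as described): microvolt values are rounded to the
    nearest hundredth (10^-2), and the relative frequency of each
    rounded value serves as its probability."""
    counts = Counter(round(v, 2) for v in window_vals)
    n = len(window_vals)
    return sum(-(c / n) * math.log(c / n, 2) for c in counts.values())
```

Either function would be applied to each 100 ms window of the averaged signal in turn, producing one entropy curve per response group.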

CONCLUSION
In this study, EEG recordings measured on the CZ electrode during the dichotic listening paradigm are examined in terms of entropy values. Different entropy computation methods are applied to the EEG data and compared with each other. It was observed that the signals in the REA and LEA responses are more significant than in the HOM responses.
Of the four methods mentioned in this paper, Methods 3 and 4 are found to be more useful than the first two in terms of describing the original EEG signal through entropy values.
In the wavelet decomposition of EEG signals, various entropy values can be obtained by using different wavelets. As a preliminary step, determining the best wavelet for the EEG signal at hand may be a fruitful application.
This study is being extended with EEG signals from different electrodes and from different subjects. The effects of ear advantage, the effects of syllables, and data mining applications on EEG data are within the scope of the study.

Figure 1 - The International 10/20 system for recording of clinical EEGs

Figure 2 - Average of EEG signals for REA responses measured on CZ electrode.

Figure 3 - Entropy of LEA in CZ electrode computed by Method #1.

Figure 4 - Entropy of REA in CZ electrode computed by Method #1.

Figure 5 - Entropy of HOM in CZ electrode computed by Method #1.

Figure 6 - Entropy of LEA in CZ electrode computed by Method #2.

Figure 7 - Entropy of REA in CZ electrode computed by Method #2.

Figure 8 - Entropy of HOM in CZ electrode computed by Method #2.

Figure 9 - Entropy of LEA in CZ electrode computed by Method #3.

Figure 10 - Entropy of REA in CZ electrode computed by Method #3.

Figure 11 - Entropy of HOM in CZ electrode computed by Method #3.

As a different entropy computation, wavelet entropy is also studied. In wavelet entropy, the signal is transformed into wavelet coefficients, and a wavelet decomposition of the EEG signal is obtained. The subband wavelet entropy is defined in terms of the relative wavelet energies of the coefficients: the probabilities are found by normalizing the subband energies by the total energy. Using these values as probabilities, Shannon entropy values are computed for each window [8]. The resulting graph is shown in Figure 12.

Figure 12 - Entropy of LEA in CZ electrode computed by the wavelet entropy method.