**Quantitative EEG - Nonproprietary**

**Recall from Module 2....**

- Fourier Transforms
- Approximate Entropy

These two blocks are equivalent because the red data points lie directly on top of the blue data points.

**Nonproprietary methods for processing the EEG**

Fourier Transformation: the X-axis changes from measuring time... ...to measuring frequency.
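As a concrete illustration, here is a short NumPy sketch of how the Fourier transform moves a signal from the time axis to the frequency axis. The sampling rate and the 10 Hz "alpha rhythm" are invented for the example, not values from the lecture:

```python
import numpy as np

fs = 256                          # assumed sampling rate in Hz
t = np.arange(0, 4, 1 / fs)       # 4 seconds of samples on the time axis
# toy "EEG": a 10 Hz rhythm buried in noise
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)

# Fourier transform: the X-axis changes from time (t) to frequency (freqs)
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
power = np.abs(np.fft.rfft(x)) ** 2

peak_hz = freqs[1:][np.argmax(power[1:])]   # ignore the DC bin
```

Plotting `power` against `freqs` would show the dominant peak at the 10 Hz rhythm.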

**Outline**

- Nonproprietary Methods
- Spectral Edge Frequency
- Compressed Spectral Array
- Approximate and Cross-Approximate Entropy
- Permutation Entropy
- Transfer Entropy
- Review

**Spectral Edge Frequency**

The frequency below which a set fraction (commonly 95%) of the EEG's total spectral power lies. It decreases with deepening sedation.
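A minimal sketch of how a 95% spectral edge frequency could be computed from the power spectrum (the function name and the defaults are assumptions of this sketch, not from the lecture):

```python
import numpy as np

def spectral_edge_frequency(x, fs, edge=0.95):
    """Frequency below which `edge` (here 95%) of the total spectral power lies."""
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    cumulative = np.cumsum(power) / power.sum()
    # first frequency bin at which cumulative power reaches the edge fraction
    return freqs[np.searchsorted(cumulative, edge)]
```

For a pure 10 Hz sine wave, essentially all power sits at 10 Hz, so the SEF95 is 10 Hz.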


**Compressed Spectral Array**

Let's take the power spectrum of an EEG, i.e. its Fourier transform. The EEG's power spectrum is recorded at regular intervals. These power spectra from successive time epochs are then displayed with colored vertical lines showing trends in the values over time.
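The epoch-by-epoch procedure can be sketched in a few lines of NumPy (the function name and the 2-second epoch length are illustrative choices; the color-coded display itself is omitted):

```python
import numpy as np

def compressed_spectral_array(x, fs, epoch_sec=2):
    """Cut the signal into successive epochs and compute each epoch's power spectrum."""
    n = int(epoch_sec * fs)                      # samples per epoch
    starts = range(0, len(x) - n + 1, n)
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    spectra = np.array([np.abs(np.fft.rfft(x[s:s + n])) ** 2 for s in starts])
    return freqs, spectra                        # one row of `spectra` per epoch
```

Stacking the rows of `spectra` (e.g. as a waterfall or heat-map plot) gives the compressed spectral array.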

**Module 5A**

**Approximate Entropy**

- Describes the "randomness" within a single signal
- Larger numbers indicate more "randomness"

Suppose that you are examining a signal containing N data points (here N = 21).

This signal can be broken down into blocks, each containing m consecutive data points (here m = 3): block #1, block #2, block #3, ..., block #19. There are N - m + 1 = 19 blocks in this signal.

Now, let's consider the first block of m data points (again, here m = 3). Translate this block to the right by 1 data point, and decide whether the red block of data points is equivalent to the underlying block of blue data points. Here, the red data points do not lie directly on top of the blue data points; therefore these two blocks are not equivalent.

Repeat this procedure, comparing the first block of data points to each of the remaining blocks until you reach the end of the signal. If you define a noise threshold r (the length of the green line), then two blocks are equivalent because the distance between each red-blue pair is less than r. In more disordered signals, fewer blocks are equivalent.

After reaching the end of the signal, calculate what fraction of the blocks are equivalent to the first block. This number is C_m(1). Here, 4 of the 19 blocks are equivalent to the first block; thus C_m(1) = 4/19 ≈ 0.21.

Repeat the process using the second block of m data points, then the third block, and so on until you reach the end of the signal. You will end up with N - m + 1 = 19 values of C_m(i), one for each of the 19 blocks of m = 3 data points. Increase the block size m by 1 and repeat. The difference between the average values of ln C_m and ln C_(m+1) is the approximate entropy of the signal.
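The block-matching walkthrough above condenses into a short NumPy function. This is a sketch: the defaults m = 2 and r = 0.2 are common choices in the ApEn literature (the lecture's worked example used m = 3), and blocks are "equivalent" when every red-blue pair of points lies within the noise threshold r:

```python
import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    """ApEn = mean(ln C_m) - mean(ln C_(m+1)), per the block-matching recipe."""
    x = np.asarray(x, dtype=float)
    N = len(x)

    def mean_log_c(m):
        # all N - m + 1 overlapping blocks of m consecutive points
        blocks = np.array([x[i:i + m] for i in range(N - m + 1)])
        # C[i] = fraction of blocks whose every point lies within r of block i's points
        C = [(np.abs(blocks - blk).max(axis=1) <= r).mean() for blk in blocks]
        return np.mean(np.log(C))

    return mean_log_c(m) - mean_log_c(m + 1)
```

A perfectly regular signal yields ApEn of 0; random noise yields a larger value.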

**Cross-Approximate Entropy**

- Describes the asynchrony between two signals (e.g. two EEG leads at different locations)
- Larger numbers indicate more asynchrony
- Algorithm is similar to that for approximate entropy

Start with the first block of m points (again, here m = 3).

Compare the red block of data points to the first block of data points in the green signal. Step through the green signal, comparing the red block of data points with each block in the green signal. Compute the fraction of blocks in the green signal that are equivalent to the red block. This number is C_m(1).

Repeat the process using the second block from the blue signal, then the third block, and so on until you reach the end of the blue signal. Increase the block size m by 1 and repeat. The difference between the average values of ln C_m and ln C_(m+1) is the cross-approximate entropy of the two signals.
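The same recipe, with blocks from one signal compared against blocks of the other, gives a cross-ApEn sketch. The function name, defaults, and the log-floor guard are assumptions of this sketch: unlike plain ApEn there is no guaranteed self-match, so a block with no match is floored to the smallest nonzero fraction to keep the logarithm finite:

```python
import numpy as np

def cross_approximate_entropy(u, v, m=2, r=0.2):
    """Cross-ApEn: compare each block of u against every block of v."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    N = len(u)                      # assumes len(u) == len(v)

    def mean_log_c(m):
        ub = np.array([u[i:i + m] for i in range(N - m + 1)])
        vb = np.array([v[i:i + m] for i in range(N - m + 1)])
        C = np.array([(np.abs(vb - blk).max(axis=1) <= r).mean() for blk in ub])
        # no self-match exists across two signals; floor zeros to keep log finite
        C = np.maximum(C, 1.0 / (N - m + 1))
        return np.mean(np.log(C))

    return mean_log_c(m) - mean_log_c(m + 1)
```

A signal compared with itself is maximally "synchronous"; comparing it with a noise-corrupted copy yields a larger (more asynchronous) value.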

**Permutation Entropy**

- Describes the relative occurrence of six three-point motifs in a given signal
- Motifs 1 and 6 represent peaks, motifs 3 and 4 represent troughs, and motifs 2 and 5 represent slopes
- Signals with large amounts of high-frequency power contain all six motifs in roughly equal amounts
- Signals with mostly low-frequency power contain mostly motifs 2 and 5, because low frequency means fewer peaks and troughs

To analyze a signal, break it into blocks of 3 data points (block #1, block #2, block #3, ..., block #19) and identify the motif to which each block corresponds.

Determine the probability p_i (i = 1 to 6) of each motif being found in the signal. The permutation entropy is then given by the equation

PE = -Σ p_i log2(p_i) / log2(6)

where the division by log2(6) normalizes PE so that it ranges from 0 to 1.

PE ranges from 0 to 1:

- PE = 0: only one motif present; not seen in real EEGs
- PE ≈ 0.4: effective minimum PE seen in real EEGs; seen if the EEG has mostly low-frequency content
- PE = 1: all motifs are present in equal numbers; PE approaches 1 as high-frequency content increases
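A compact sketch of the motif-counting recipe (ties between equal samples are broken by position here, a simplification of this sketch):

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, order=3):
    """Normalized permutation entropy over ordinal motifs of `order` consecutive points."""
    x = np.asarray(x, dtype=float)
    # each block's motif = the rank ordering of its points (6 possibilities for order 3)
    motifs = [tuple(np.argsort(x[i:i + order])) for i in range(len(x) - order + 1)]
    p = np.array(list(Counter(motifs).values()), dtype=float)
    p /= p.sum()
    # -sum p log2 p, divided by log2(order!) so the result ranges from 0 to 1
    return float(-(p * np.log2(p)).sum() / math.log2(math.factorial(order)))
```

A monotonic ramp contains only one motif (PE = 0), while white noise contains all six in roughly equal amounts (PE near 1).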

**Transfer Entropy**

- Describes the amount of information "transferred" from one signal to another, related signal (e.g. two EEG leads at different locations)
- It is therefore a measure of how one signal's values affect the values of a second signal
- More precisely, it is the excess number of bits needed to predict the next value of one signal if you assume the two signals are independent instead of acknowledging that they are related
- Transfer entropy has been used to correctly identify pathways of electrical activity flow in the brain and to locate the focal zones of epileptic seizures

Consider two signals A and B such that the next value of A depends on the previous k values of A and the previous m values of B. The transfer entropy from B to A is given by:

T_BA = Σ p(a_(n+1), a_n^(k), b_n^(m)) · log2 [ p(a_(n+1) | a_n^(k), b_n^(m)) / p(a_(n+1) | a_n^(k)) ]

where a_n^(k) denotes the previous k values of A, b_n^(m) denotes the previous m values of B, and the sum runs over all possible values.

Let's illustrate this using a card analogy. Suppose that two people each draw one card, simultaneously, from a fair deck of 52 playing cards. Player A tells you the suit of his card (A1), and player B tells you the color of her card (B1). Your task: predict the suit of player A's second card (A2).

Nominally, player B discards her card, and player A gets a second card and then discards his first card. But player A does not draw cards independently of player B. In fact, here is what happens:

- Player B gives her first card to player A (this becomes A2).
- Player A discards his first card to a random spot in the deck.
- Player B draws a second card from the deck.

Since A2 is the same card as B1, you're now predicting the suit of a card whose color you already know. This information changes your prediction of the suit of player A's second card (A2). The summation in the TE formula cycles through all possible values of A1 and A2 (the suits of player A's first and second cards) and B1 (the color of player B's first card).

Entering these probabilities into the TE equation yields a transfer entropy of 52/51 bits (about 1 bit). This makes sense, as the information gained from player B was one binary piece of information (red or black). (The unit is bits because the base of the logarithm is 2.)

T_BA is always a non-negative number; a value of zero indicates no information transfer. To determine the direction of net information flow, find the net transfer entropy (T_BA - T_AB):

- A positive value means B influences A
- A negative value means A influences B
- A value of zero means A and B are independent

Ok, I get this deck of cards stuff. But how does this apply to EEGs?

When applied to two EEG leads in locations A and B, a positive (T_BA - T_AB) means that electrical activity at location B predicts that related electrical activity will occur at location A. There may be a pathway of cortical activation from location B to location A. If the patient is having a seizure, then location B may be the origin (or at least is recruited before location A).
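For discrete-valued signals, the TE definition can be estimated directly from empirical probabilities. Here is a sketch for history lengths k = m = 1; the function name and the binary test signals are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def transfer_entropy(a, b):
    """T_{B->A} for discrete signals with one step of history (k = m = 1), in bits."""
    a = list(a)
    b = list(b)
    n = len(a) - 1
    # empirical counts of (a_{n+1}, a_n, b_n), (a_n, b_n), (a_{n+1}, a_n), and (a_n)
    triples = Counter(zip(a[1:], a[:-1], b[:-1]))
    ab = Counter(zip(a[:-1], b[:-1]))
    aa = Counter(zip(a[1:], a[:-1]))
    solo = Counter(a[:-1])
    te = 0.0
    for (a2, a1, b1), c in triples.items():
        p_joint = c / n
        p_full = c / ab[(a1, b1)]            # p(a_{n+1} | a_n, b_n)
        p_self = aa[(a2, a1)] / solo[a1]     # p(a_{n+1} | a_n)
        te += p_joint * np.log2(p_full / p_self)
    return te
```

As a sanity check in the spirit of the card analogy: if signal A simply copies signal B one step later, then B transfers about 1 bit per step to A, while A transfers essentially nothing back to B.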

**References**

SPECTRAL EDGE FREQUENCY AND COMPRESSED SPECTRAL ARRAY

Burgess R, Cant BR, Hume AL, Priestley LC, Shaw NA. Cerebral monitoring by compressed spectral array. N Z Med J 1977;86:521-523 (PubMed ID 272570)

Rampil IJ, Sasse FJ, Smith NT, Hoff BH, Flemming DC. Spectral edge frequency - A new correlate of anesthetic depth. Anesthesiology 1980;53:S12

APPROXIMATE AND CROSS-APPROXIMATE ENTROPY

Pincus S, Singer BH. Randomness and degrees of irregularity. Proc Natl Acad Sci USA 1996;93:2083-2088 (PubMed ID 11607637)

Pincus SM, Mulligan T, Iranmanesh A, Gheorghiu S, Godschalk M, Veldhuis JD. Older males secrete luteinizing hormone and testosterone more irregularly, and jointly more asynchronously, than younger males. Proc Natl Acad Sci USA 1996;93:14100-14105 (PubMed ID 8943067)

Hudetz AG. Effect of volatile anesthetics on interhemispheric EEG cross-approximate entropy in the rat. Brain Research 2002;954:123-131 (PubMed ID 12393240)

PERMUTATION ENTROPY

Olofsen E, Sleigh JW, Dahan A. Permutation entropy of the electroencephalogram: a measure of anaesthetic drug effect. Br J Anaesth 2008;101:810-821 (PubMed ID 18852113)

TRANSFER ENTROPY

Schreiber T. Measuring information transfer. Phys Rev Lett 2000;85:461-464 (PubMed ID 10991308)

Sabesan S, Good LB, Tsakalis KS, Spanias A, Treiman DM, Iasemidis LD. Information flow and application to epileptogenic focus localization from intracranial EEG. IEEE Trans Neural Syst Rehabil Eng 2009;17:244-253 (PubMed ID 19497831)


**Content and Lecture by: Brad Fritz**

Prezi by: Gugua Okafor and Ting Zhang
