Quantitative EEG and normative databases

EEG raw recordings are complex patterns and cry out for further processing

Electroencephalography (EEG), reflecting the ebb and flow of electrical activity in the brain, has been in and out of fashion over the years.  Its complexity has meant that surges of interest have usually coincided with advances in the hardware and software used to process it.  So-called Quantitative EEG (QEEG) processes the EEG signals in order to derive additional meaning and refine the assessment and treatment of conditions such as traumatic brain injury and stroke.  One approach which has become popular is to compare the EEG recorded from an individual with a database of norms.  In this article we take a look at the fundamentals of this approach.

The origins of EEG
Electroencephalography (EEG) has its origins in the 1920s, when Hans Berger was able to record at the scalp the very small electrical signals from his son’s brain.  His quest was to find relationships between the electrical potentials he measured and various “mental events”, including telepathy.  Although he did not succeed in a complete sense, he did establish the foundations of what has become a powerful and flexible technique for the analysis of cerebral cortical function.

Even in the earliest examinations of the EEG it was recognised that these electrical signals exhibit complex patterns that vary with many aspects of the human condition and state of arousal; for example, eyes-open versus eyes-closed recordings show clear changes in the patterns of electrical activity.

The "classical" frequency bands and how they relate to consciousness

Berger applied his knowledge of the Fourier series (based on the work of the 19th-century French mathematician Joseph Fourier) to the problem of how to describe an EEG signal more precisely.

The Fourier Transform breaks down the original EEG pattern into a set of discrete frequency bands

This approach allows the representation of complex, time-varying, periodic signals as a set of much simpler “basis functions”.  These basis functions, sine and cosine waves, allowed Berger, and everyone since, to describe a complex signal like the EEG as a set of sinusoids of different magnitudes and frequencies.  If you imagine this set of sine waves added together again, you would in effect recreate the original signal.  Berger was able to theorise about and analyse EEG signals in this way, and started to describe the so-called frequency bands (alpha, beta, delta and so on) that are still referred to today.

Modern computer hardware and software allow us to rapidly calculate these frequencies using a Fast Fourier Transform (FFT), examine them in the “conventional” bands, and then further process and display the data in many ways.
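As a rough illustration of the idea, the sketch below uses a plain FFT to estimate how the power of a synthetic EEG-like signal is distributed across the conventional bands.  The sampling rate, band edges and signal are all assumptions chosen for illustration; band definitions vary slightly between authors, and practical QEEG software typically uses more robust spectral estimators.

import numpy as np

# Illustrative sketch only: the sampling rate, band edges and synthetic signal
# below are assumptions, not values taken from the article.
fs = 256                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 seconds of data

# Synthetic "EEG": a 10 Hz (alpha) rhythm plus noise, standing in for a real recording
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

# FFT and one-sided power spectrum
spectrum = np.fft.rfft(eeg)
freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
power = np.abs(spectrum) ** 2

# Conventional band edges in Hz (definitions vary slightly between authors)
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    print(f"{name}: relative power {power[mask].sum() / power.sum():.2f}")

Running this on the synthetic signal shows most of the power falling in the alpha band, exactly the kind of summary that the conventional band decomposition is designed to give.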

These techniques have opened up many possibilities for analysis of the EEG which are still being explored and certainly we can’t cover them all here. 

Modern systems, such as those in the NeXus range from Mind Media, can undertake Quantitative EEG (QEEG) analysis, or use these signals as part of a neurofeedback protocol, with little fuss compared with even ten years ago.

Brain maps provide a different perspective on the EEG signal

The use of frequency bands, brain maps, coloured bars and so on is very useful because of the fundamental difficulty of visually analysing and interpreting a raw EEG time series.  For most of us mere mortals it is impossible, or at best very time consuming, and computer analysis makes QEEG much more accessible.  However, even though the technology has made more kinds of analysis available, it does not remove the need to think carefully about what the results actually mean.

Brain mapping is a type of display often used within the context of QEEG and neurofeedback.  It is a topographic map generated from the results of a multi-channel EEG frequency analysis.  Some interpolation is necessary to generate these maps, which start from the signal values measured at each electrode site: the interpolation “guesses” the values between the electrode sites, and a colour scale is typically assigned to the signal values.
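As a minimal sketch of that interpolation step, the snippet below takes alpha-band values at a handful of assumed electrode positions and fills in a grid of values between them using scipy's general-purpose griddata routine.  The positions and values are invented for illustration, and dedicated QEEG packages generally use more sophisticated spherical-spline interpolation over many more channels.

import numpy as np
from scipy.interpolate import griddata

# Hypothetical 2-D projections of a few 10-20 electrode positions (x, y in
# arbitrary head coordinates) and an alpha-band value at each site.
positions = np.array([[0.0, 0.8],    # Fz
                      [-0.6, 0.0],   # T3
                      [0.0, 0.0],    # Cz
                      [0.6, 0.0],    # T4
                      [0.0, -0.8]])  # Pz
alpha_power = np.array([4.1, 2.7, 5.3, 2.9, 6.0])  # illustrative values

# Regular grid covering the head area
grid_x, grid_y = np.mgrid[-1:1:100j, -1:1:100j]

# Interpolate ("guess") the values between electrode sites; points outside the
# convex hull of the electrodes come back as NaN with this method.
topo = griddata(positions, alpha_power, (grid_x, grid_y), method='cubic')

The resulting grid (topo) is what gets mapped to a colour range and drawn as the familiar head-shaped brain map.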

Brain maps can represent different types of information, such as voltage at an instant in time, frequency-band data, or a so-called z-score of such time- or frequency-domain activity.

Normative databases have become common practice in the field of neurofeedback, and they have certain advantages as well as potential disadvantages.  The idea is to identify the normal range of variation in aspects of the EEG signals of a population of “normals”, in order to examine whether a given individual belongs to this normal population or not.
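In practice this comparison is usually expressed as a z-score: how many standard deviations the individual's measurement lies from the database mean for a matched group.  The sketch below shows the arithmetic with made-up normative values; real databases hold such statistics per electrode site, frequency band and age group.

# Hypothetical normative statistics for alpha-band relative power at one site
# in one age band; the numbers are assumptions for illustration only.
norm_mean, norm_sd = 0.22, 0.05

subject_value = 0.36  # the individual's measured value (assumed)

z = (subject_value - norm_mean) / norm_sd
print(f"z-score: {z:.1f}")

# A common (but arbitrary) convention flags |z| > 2 as outside the typical
# range; statistical deviance alone does not establish clinical abnormality.
if abs(z) > 2:
    print("Outside the typical range for this database")

The threshold of two standard deviations is only a statistical convention, which is exactly where the caveats discussed below come in.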

This is potentially of value to clinical neuropsychologists in that databases can help to quantify the features of the EEG for normal populations and permit comparisons between different population groups.  A potential problem arises when trying to assess whether an individual patient is normal.

Artefact in EEG due to tensing jaw muscles

The first difficulty relates to the need for artefact-free data.  Even small artefacts that might be easily spotted in conventional EEG recordings can be very misleading in computerised EEG analysis.  With modern systems the sources of error have been reduced, but there is still significant potential for signal noise due to muscle contractions in the face and scalp, nearby electromagnetic sources, and more.

Also, patients cannot always be compared to a normative database even if the quality of the EEG record is good.  The subjects included in the database may differ from the patient in ways that could influence the results.  Some of the databases in common use are necessarily built from subjects screened to exclude, for example, anyone taking drugs or medication, or with a history of head trauma or a significant medical problem.

Some authors question the clinical significance of QEEG measures lying outside normal limits.  An abnormality determined on the basis of population statistics would still need to be confirmed as an actual, clinically meaningful abnormality.

The fundamental point is that there is potential for confusion between a statistical anomaly and a true clinical abnormality.

There is growing interest in, and utility of, normative databases, but this should not eliminate the need to think more widely and to link the science and the art of EEG analysis.  A clinician should still be involved to evaluate and correlate all the results in order to derive a clinical impression; to rely on a database alone may be misleading.

Reference
Romano-Micha, J. (2003). “Databases or specific training protocols for neurotherapy? A proposal for a clinical approach to neurotherapy.” Journal of Neurotherapy: Investigations in Neuromodulation, Neurofeedback and Applied Neuroscience, Volume 7, Issue 3-4.
