EEG Digitisation and Processing

In our contact with clients we see an amazing spread of interest in biofeedback across the fields of music, psychology, sport, business, rehabilitation and much more.  The technology available now to apply biofeedback or neurofeedback has never been easier to use, but it is still important to understand the principles of instrumentation and measurement if interpretation errors are to be avoided.  It is understandable that non-engineers wish to focus on their particular application, but quality research and reliable applications demand attention to some of the fundamentals of signal processing.  EEG - whether used for biofeedback or to quantify brain function in some way - is of interest to many, so let's take a closer look.

The EEG is a bioelectric potential recorded from the surface of the head using appropriate electrodes and instrumentation. The first human EEG recordings were published back in 1929 by Hans Berger.  He could not benefit from the modern hardware and processing capability we have now, so it is impressive that these very small signals - of the order of microvolts - could be measured with any fidelity.  Within ten years the characteristic components of the EEG had been described as the delta, theta, alpha and beta frequency bands that are still used today.  But what does this actually mean?

Fourier series idea

Imagine the challenge of describing the EEG signal elegantly

Inspecting a raw EEG recording from 1929, we would probably have seen a chart trace: a complex, varying signal plotted against time.

Suppose for a moment we wanted to describe this signal in an unambiguous way. Imagine trying to describe a particular EEG recording to someone over the telephone; it would be a challenge to do so with any degree of accuracy.

Of course this type of problem is not new, and mathematicians got to grips with it a long time ago.  They weren't thinking of EEG in particular but about the properties of anything that varies over time - it could be crop yields, temperature changes in the Arctic or stock market prices - the approach would be largely the same.

In the early 1800s a French mathematician called Fourier worked out how to describe patterns of change (specifically those where patterns repeat in a cyclic fashion) in a way that was both concise and elegant.

He used what we call "basis functions" to describe what we don't know (the original waveform) in terms of things we do know.  Fourier used sine and cosine waves as basis functions, together with a method for choosing them.  Any sine or cosine wave is completely specified by its magnitude and frequency, which makes these very convenient basis functions.  Fourier's method chooses them so as to produce a series of sine and cosine waves of different magnitudes and frequencies which, when added together, approximate the original signal.  The more terms included in the Fourier series, the better the approximation, of course.
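As a rough sketch of the idea (an illustration in Python, not anything taken from Fourier's own workings), the short example below builds up a square wave - a classic repeating pattern - from its sine-wave terms and shows the approximation error shrinking as more terms are added:

# Illustrative sketch: approximating a repeating waveform with a Fourier series.
# A square wave is used because its series has a simple form: only odd
# harmonics appear, each with magnitude 4/(pi*k).
import numpy as np

t = np.linspace(0.0, 1.0, 1000, endpoint=False)   # one period of the waveform
square = np.sign(np.sin(2 * np.pi * t))           # the "unknown" original signal

def fourier_approximation(t, n_terms):
    # Add up the first n_terms sine-wave basis functions of the square wave.
    approx = np.zeros_like(t)
    for k in range(1, 2 * n_terms, 2):            # odd harmonics 1, 3, 5, ...
        approx += (4.0 / (np.pi * k)) * np.sin(2 * np.pi * k * t)
    return approx

for n in (1, 5, 50):
    rms = np.sqrt(np.mean((square - fourier_approximation(t, n)) ** 2))
    print(f"{n:3d} sine terms -> RMS error {rms:.3f}")

Each extra term is a sine wave of higher frequency and smaller magnitude, and the printed RMS error is one simple way of seeing the approximation improve as terms are added.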

Some authors have described the Fourier series approach as like using a prism to split a beam of light into a spectrum of colours.

Today we tend to use digital computers for signal processing rather than paper charts. A computer cannot handle a signal that is continuous in time: even the shortest stretch of data must be sampled before it can be represented digitally.  Data acquisition systems such as the NeXus series therefore sample the signal in time (set by the sample rate) and also "sample" it in magnitude (set by the number of bits of resolution).
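A minimal sketch of those two steps follows.  The numbers here - a rate of 256 samples per second, 16 bits of resolution and an input range of plus or minus 200 microvolts - are illustrative assumptions, not the specification of any particular device:

# Assumed, illustrative acquisition settings (not any specific device):
import numpy as np

sample_rate = 256                 # samples per second
n_bits = 16                       # resolution of the analogue-to-digital converter
full_scale_uv = 200.0             # assumed input range of -200 to +200 microvolts

t = np.arange(0, 1.0, 1.0 / sample_rate)      # the sample instants for one second
analog = 30.0 * np.sin(2 * np.pi * 10 * t)    # a 10 Hz, 30 microvolt "alpha-like" wave

step = (2 * full_scale_uv) / (2 ** n_bits)    # microvolts represented by one digital step
digital = np.round(analog / step) * step      # quantise each sample to the nearest step

print(f"step size: {step * 1000:.1f} nanovolts")
print(f"largest quantisation error: {np.max(np.abs(analog - digital)) * 1000:.1f} nanovolts")

The sample rate decides how finely the signal is chopped up in time, and the number of bits decides how finely each sample's magnitude can be resolved.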

When setting up for data acquisition with BioTrace and the NeXus devices, settings such as sample rate and signal resolution have defaults appropriate to the situation.  With a click of a button you can calculate an FFT (Fast Fourier Transform), an efficient digital implementation of Fourier's method. This lets you see the original waveform split into "frequencies" which can be classified and refined in many ways.
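The sketch below is not BioTrace's internal processing; it is just a small numpy illustration, using commonly quoted band boundaries (which vary somewhat between authors), of what "calculate an FFT and group it into frequency bands" means:

# Illustrative only: a synthetic "EEG" signal, its FFT, and band powers.
import numpy as np

sample_rate = 256                                  # Hz, an assumed rate
t = np.arange(0, 4.0, 1.0 / sample_rate)           # four seconds of synthetic "EEG"
signal = (20 * np.sin(2 * np.pi * 10 * t)          # a 10 Hz (alpha band) component
          + 5 * np.sin(2 * np.pi * 20 * t)         # a 20 Hz (beta band) component
          + np.random.normal(0, 2, t.size))        # background noise

spectrum = np.fft.rfft(signal)                     # FFT of a real-valued signal
freqs = np.fft.rfftfreq(signal.size, 1.0 / sample_rate)
power = np.abs(spectrum) ** 2                      # power in each frequency bin

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (low, high) in bands.items():
    mask = (freqs >= low) & (freqs < high)
    share = power[mask].sum() / power.sum()
    print(f"{name:6s} {low:4.1f}-{high:4.1f} Hz : {share:.0%} of total power")

Run on this synthetic signal, most of the power falls in the alpha band, exactly as the 10 Hz component would lead us to expect.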

This is a lot of power and capability, but it doesn't remove the need to grasp the underlying meaning of what is happening technically. In other articles we will delve into various aspects of signal processing basics.