Electroencephalography (EEG) provides a non-invasive window into the brain’s electrical activity, capturing the summed, synchronized activity of millions of neurons. The raw recording is a complex mixture of true brain signals and various sources of noise, appearing at first as an overwhelming, chaotic stream of voltage fluctuations. The purpose of EEG analysis is to systematically convert these disorganized electrical signals into quantifiable metrics that reveal meaningful information about brain function. This transformation requires a sequence of meticulous processing steps, from initial data cleaning to mathematical decomposition, designed to filter out noise and isolate subtle, patterned electrical events.
Preparing the Raw Data for Analysis
Before scientific interpretation can begin, raw EEG recordings must undergo a rigorous cleaning process, as the true brain signal is often much weaker than the contaminating noise. These unwanted signals, known as artifacts, originate from both physiological sources (e.g., eye blinks or muscle contractions) and technical sources. Physiological artifacts, such as ocular or muscle (EMG) activity, can be ten times stronger than the brain activity being measured, effectively obscuring the neural data.
Filtering is one of the first and most direct steps, where different frequency ranges are targeted for removal or attenuation. A high-pass filter removes slow, non-neural drifts in the signal, while a low-pass filter removes high-frequency noise, such as muscle artifacts. Additionally, a notch filter is used to eliminate line noise, a technical artifact caused by alternating current, which manifests as a sharp peak at 50 or 60 Hz depending on the regional power grid.
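As a concrete illustration, the sketch below chains these three filters over a simulated single-channel signal with SciPy; the sampling rate and the cutoffs (0.5 Hz high-pass, 40 Hz low-pass, 50 Hz notch) are illustrative assumptions, not universal settings.

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

fs = 250.0  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
# Simulated signal: 10 Hz alpha rhythm + slow drift + 50 Hz line noise
eeg = (np.sin(2 * np.pi * 10 * t)
       + 2.0 * np.sin(2 * np.pi * 0.1 * t)   # slow, non-neural drift
       + 0.5 * np.sin(2 * np.pi * 50 * t))   # line noise

# High-pass at 0.5 Hz to remove slow drifts
b, a = butter(4, 0.5, btype="highpass", fs=fs)
eeg = filtfilt(b, a, eeg)

# Low-pass at 40 Hz to attenuate high-frequency (e.g., muscle) noise
b, a = butter(4, 40.0, btype="lowpass", fs=fs)
eeg = filtfilt(b, a, eeg)

# Notch at 50 Hz to remove line noise (use 60 Hz where applicable)
b, a = iirnotch(50.0, Q=30.0, fs=fs)
eeg = filtfilt(b, a, eeg)
```

Zero-phase filtering with `filtfilt` avoids shifting the timing of signal features, which matters for the latency measurements discussed later.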
Beyond simple filtering, advanced techniques are employed to separate brain activity from artifacts. Independent Component Analysis (ICA) is a statistical method that unmixes the recorded signal into maximally independent underlying sources. This allows components representing eye blinks or heartbeat pulses to be identified and removed without significantly distorting the underlying brain activity. The final preparation step is re-referencing, which establishes a common electrical baseline for all recording electrodes, often by calculating a Common Average Reference (CAR) to improve the signal-to-noise ratio.
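A minimal sketch of this stage, assuming the widely used MNE-Python library and simulated data in place of a real recording; the excluded component index is a placeholder for whatever components visual or automated inspection flags as artifactual.

```python
import numpy as np
import mne

# Simulated 8-channel recording (values in volts), 10 s at 250 Hz
fs, n_ch = 250.0, 8
rng = np.random.default_rng(0)
data = rng.standard_normal((n_ch, int(10 * fs))) * 1e-5
info = mne.create_info([f"EEG{i}" for i in range(n_ch)], sfreq=fs, ch_types="eeg")
raw = mne.io.RawArray(data, info)
raw.filter(l_freq=1.0, h_freq=None)  # ICA behaves better on high-passed data

# Unmix the recording into independent components, drop the artifactual
# one(s), and reconstruct the cleaned signal
ica = mne.preprocessing.ICA(n_components=n_ch, random_state=0)
ica.fit(raw)
ica.exclude = [0]                # placeholder: index of an artifact component
raw_clean = ica.apply(raw.copy())

# Re-reference every channel to the common average (CAR)
raw_clean.set_eeg_reference("average")
```

The CAR step is equivalent to subtracting the across-channel mean from each channel at every time point.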
Analyzing Activity Locked to Specific Events
Once the data is clean, a common approach focuses on time-domain activity by isolating Event-Related Potentials (ERPs). ERPs are small voltage fluctuations in the EEG directly tied to the timing of a specific sensory, cognitive, or motor event. Since the brain’s response to a single event is tiny and buried in the ongoing background EEG, researchers employ a process of trial averaging.
Averaging EEG segments time-locked to hundreds of similar events cancels out random, non-event-related background noise, allowing the consistent ERP signal to emerge. The resulting ERP waveform is characterized by a series of positive (P) and negative (N) voltage peaks, labeled by polarity and approximate timing in milliseconds. For instance, the P300 component is a positive deflection appearing roughly 250 to 500 milliseconds after a relevant stimulus.
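The averaging logic itself is compact. Below is a minimal NumPy sketch in which the signal, sampling rate, and event onsets are simulated stand-ins for real data:

```python
import numpy as np

fs = 250.0                                   # sampling rate (assumed)
rng = np.random.default_rng(1)
eeg = rng.standard_normal(int(120 * fs))     # 2 min of one-channel "EEG"
event_samples = np.arange(500, len(eeg) - 500, 1000)  # stimulus onsets

# Epoch window: 200 ms before to 800 ms after each event
pre, post = int(0.2 * fs), int(0.8 * fs)
epochs = np.stack([eeg[s - pre : s + post] for s in event_samples])

# Baseline-correct each trial with its pre-stimulus mean, then average:
# noise uncorrelated with the events cancels, the ERP remains
epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
erp = epochs.mean(axis=0)
times_ms = np.arange(-pre, post) / fs * 1000.0
```

With \(N\) trials, unsynchronized noise shrinks roughly as \(1/\sqrt{N}\), which is why hundreds of repetitions are typically collected.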
These components are analyzed by measuring their latency (the time delay between the stimulus and the peak) and their amplitude (the strength of the voltage change). Latency reflects the speed of information processing, while amplitude is interpreted as the degree of attentional resources allocated or the extent of information processing. For example, prolonged P300 latency and reduced amplitude are frequently observed in patients with cognitive impairment, indicating slower processing. The N400 is a negative peak sensitive to semantic processing, showing a larger (more negative) amplitude when a word is semantically unexpected in its context.
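Continuing the sketch above (reusing its `erp` and `times_ms` arrays), both measures can be read directly off the averaged waveform; the 250–500 ms search window is an illustrative choice for a P300-like component:

```python
# Largest positive deflection in the post-stimulus search window
win = (times_ms >= 250) & (times_ms <= 500)
peak_idx = np.argmax(erp[win])
p300_amplitude = erp[win][peak_idx]        # peak voltage (amplitude)
p300_latency_ms = times_ms[win][peak_idx]  # time of the peak (latency)
```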
Analyzing Rhythmic Brain Activity
In contrast to event-locked analysis, the frequency-domain approach, or spectral analysis, focuses on the brain’s continuous, oscillatory activity independent of any specific stimulus. The brain’s electrical activity is composed of waves oscillating at different speeds, and spectral methods such as the Fast Fourier Transform (FFT) decompose the complex EEG signal into these fundamental rhythmic components (a minimal sketch follows the band list below).
These oscillations are grouped into frequency bands associated with different states of consciousness and cognitive function:
- Delta (0.5–4 Hz) is associated with deep sleep.
- Theta (4–7 Hz) is seen during drowsiness or internal mental operations.
- Alpha (8–13 Hz) is prominent during relaxed wakefulness.
- Beta (13–30 Hz) activity is associated with active concentration and motor preparation.
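As a minimal sketch of that decomposition, the NumPy snippet below applies the FFT to a toy signal built from two known rhythms and recovers their frequencies and amplitudes; all values are illustrative:

```python
import numpy as np

fs = 250.0
t = np.arange(0, 4, 1 / fs)
# Toy signal: a 6 Hz theta rhythm plus a weaker 10 Hz alpha rhythm
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 10 * t)

# FFT decomposes the signal into its constituent frequency components
spectrum = np.fft.rfft(eeg)
freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
amplitudes = np.abs(spectrum) * 2 / eeg.size  # per-rhythm amplitude
# amplitudes peaks at 6 Hz (~1.0) and 10 Hz (~0.5), recovering both rhythms
```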
The primary metric extracted from this analysis is power, which represents the intensity or strength of the activity within a specific frequency band. Researchers calculate the Power Spectral Density (PSD), showing how power is distributed across frequencies, allowing for comparisons between conditions. For example, a shift toward increased Theta power and decreased Alpha power is often noted in conditions of cognitive decline.
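The sketch below estimates a PSD with Welch's method (an FFT-based estimator in SciPy) and integrates it over the theta and alpha bands; the simulated signal and the exact band edges are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 250.0
rng = np.random.default_rng(2)
t = np.arange(0, 60, 1 / fs)
# Toy signal: an alpha-band oscillation buried in broadband noise
eeg = np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)

# Welch's method averages FFTs of overlapping windows for a stable PSD
freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))

def band_power(freqs, psd, lo, hi):
    """Integrate the PSD over one frequency band."""
    mask = (freqs >= lo) & (freqs <= hi)
    return trapezoid(psd[mask], freqs[mask])

theta = band_power(freqs, psd, 4, 7)
alpha = band_power(freqs, psd, 8, 13)
print(f"theta/alpha power ratio: {theta / alpha:.2f}")
```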
Coherence and Connectivity
A further extension of rhythmic analysis is coherence, which measures the degree of linear synchronization between EEG signals recorded at two different electrode sites. Coherence values range from 0 to 1, with higher values indicating stronger functional coupling, and by inference greater information transmission, between the underlying brain regions. Analyzing changes in coherence provides insight into the functional connectivity and network dynamics of the brain.
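In a hedged SciPy sketch, two simulated channels that share a 10 Hz rhythm show high magnitude-squared coherence in the alpha band and low coherence elsewhere; the signals and band limits are illustrative.

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0
rng = np.random.default_rng(3)
t = np.arange(0, 60, 1 / fs)
shared = np.sin(2 * np.pi * 10 * t)          # rhythm common to both sites
ch1 = shared + rng.standard_normal(t.size)   # "electrode 1"
ch2 = shared + rng.standard_normal(t.size)   # "electrode 2"

# Magnitude-squared coherence per frequency, bounded between 0 and 1
freqs, coh = coherence(ch1, ch2, fs=fs, nperseg=int(2 * fs))
alpha_coh = coh[(freqs >= 8) & (freqs <= 13)].mean()
print(f"mean alpha-band coherence: {alpha_coh:.2f}")
```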
Translating Data into Neurological Insights
The final stage of EEG analysis involves translating numerical metrics—such as ERP latencies, P300 amplitudes, or Alpha power values—into meaningful neurological and psychological insights. This requires applying statistical rigor to determine whether observed differences are reliable and not due to random chance. Statistical tests are used to compare these metrics across different experimental conditions, such as comparing Alpha power during rest versus during a task, or comparing P300 latency between control and patient groups.
A significant finding is then quantified by an effect size, which provides a standardized measure of the magnitude of the difference between conditions, independent of the sample size. Effect sizes, like Cohen’s \(d_z\) for within-subject designs, help determine the practical importance of the finding and contribute to planning future studies with adequate statistical power. Advanced statistical models, such as hierarchical linear modeling (HLM) or multivariate analysis, are used to account for variability across individual trials and subjects.
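A minimal sketch of both steps with SciPy, using simulated placeholder values for per-subject alpha power; the group size and effect magnitude are arbitrary.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(4)
# Hypothetical alpha-power values for 20 subjects, rest vs. task
rest = rng.normal(10.0, 2.0, size=20)
task = rest - rng.normal(1.5, 1.0, size=20)  # task suppresses alpha

# Paired t-test across the two within-subject conditions
t_stat, p_value = ttest_rel(rest, task)

# Cohen's d_z: mean of the paired differences over their SD
diff = rest - task
d_z = diff.mean() / diff.std(ddof=1)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d_z = {d_z:.2f}")
```

For a paired design, \(d_z\) equals \(t/\sqrt{n}\), which makes it straightforward to recover from reported test statistics.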
Ultimately, researchers use these statistically validated findings to draw conclusions about cognitive processes, neurological disorders, or the effectiveness of an intervention. For instance, a significant reduction in P300 amplitude may be interpreted as evidence of impaired attention allocation in a specific patient population. By consistently linking quantifiable metrics to theoretical constructs, EEG analysis moves from the raw electrical signal to a comprehensive understanding of human brain function.