You can download the following datasets to test the toolbox or to go through the tutorial. Each .zip file contains EEG data from a single participant (in EEGLAB format) as well as the concurrently recorded raw eye tracking data (already converted to plain text format).
This short demo dataset contains one block of trials (120 trials) from an emotional face perception experiment, briefly reported in the online supplement to Dimigen et al. (2009). In this "traditional" ERP experiment, participants were instructed to keep fixating the screen center during all trials. Participants were presented with color portraits of happy, angry, or neutral faces. Following the presentation of a small fixation cross for 1000 ms, a single face stimulus (7.5 x 8.5°) was presented in the screen center for 1350 ms. The task was to classify the emotional expression of the face as quickly as possible by pressing one of three manual response buttons.
EEG and EOG were recorded at a sampling rate of 500 Hz using BrainProducts amplifiers. EEG and EOG data is provided here for a subset of 26 channels to keep the file size small. Channels were referenced against the mean of all 26 electrodes (average reference). Eye movements and pupil diameter were recorded binocularly with an SMI iView X hi-speed eye tracker at a 500 Hz sampling rate.
Download example dataset:
[dataset available upon request, please use datasets 2 and 3 for now]
This is a short demo dataset consisting of five experimental trials and only 25 of the original recording channels. In the experiment, a participant searched for a small target stimulus within greyscale pictures of natural scenes. The pictures (800 x 600 pixel greyscale images) were presented in the center of a screen running at a resolution of 1024 x 768 pixels. A search target (a slowly expanding dark-gray disc) appeared between 8 and 16 seconds after picture onset at a random location within the picture. Participants pressed a button as soon as they detected the target. One second (1000 ms) after the button press, the picture was replaced by an empty dark screen. Because of the varying time needed to find the target, trials have a variable duration.
Eye movements were recorded binocularly with an SMI iView X tracker at 500 Hz. The EEG data, recorded with BrainProducts amplifiers, is referenced against the average of all recording channels. Offline, the EEG was band-pass filtered from 1 to 40 Hz.
Download example dataset here [8.3 MB]
This is a short demo dataset from a natural reading experiment. In the experiment, participants read lists of five words from left to right. Parafoveal words were masked using the Moving Window Paradigm. The task was to report whether the list contained the name of an animal (e.g. tiger). Instead of shared triggers, special synchronization messages, marked by the keyword "MYKEYWORD", were used as synchronization events for the eye tracker. Only one event value, the number 3, was sent repeatedly to the eye tracker during the recording; it serves as both the start-event and the end-event for synchronization. The toolbox picks its first occurrence as the start-event and its last occurrence as the end-event.
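The two-point synchronization described above can be sketched as follows. This is only a minimal illustration with made-up timestamps, not the toolbox's actual implementation: the first and last occurrence of the shared event define a linear (scale + offset) mapping from eye-tracker time onto the EEG time axis.

```python
# Hypothetical sketch of two-point (start-event / end-event) synchronization.
# Function and variable names are illustrative, not EYE-EEG API.

def linear_sync(et_event_times, eeg_event_times, et_sample_times):
    """Map eye-tracker sample times onto the EEG time axis.

    et_event_times / eeg_event_times: times of the shared event (e.g. "3")
    in each recording; the first and last entries anchor the mapping.
    """
    et_start, et_end = et_event_times[0], et_event_times[-1]
    eeg_start, eeg_end = eeg_event_times[0], eeg_event_times[-1]
    # Linear two-point mapping: one scale factor, one offset
    scale = (eeg_end - eeg_start) / (et_end - et_start)
    return [eeg_start + (t - et_start) * scale for t in et_sample_times]

# Example: the eye-tracker clock runs 5000 ms ahead of the EEG clock
et_events = [5000, 6000, 9000]   # times of event "3" in ET time (ms)
eeg_events = [0, 1000, 4000]     # the same events in EEG time (ms)
print(linear_sync(et_events, eeg_events, [5000, 7000, 9000]))
# → [0.0, 2000.0, 4000.0]
```

Because only the first and last occurrence are used, intermediate occurrences of the event are free to serve other purposes (or to be checked afterwards as a measure of synchronization quality).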
Eye movements were recorded binocularly with an Eyelink 1000 tracker at 1000 Hz. EEG data was recorded from 72 channels with Biosemi Active amplifiers at a rate of 512 Hz. Offline, the EEG was band-pass filtered from 0.1 to 100 Hz and converted to average reference.
Online detection of saccades/fixations/blinks was switched on. Eye movement events can be directly imported using EYE-EEG.
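To illustrate what these online-detected events look like in the raw tracker output, the sketch below extracts end-of-saccade (ESACC) records from EyeLink ASCII text lines. The field layout follows EyeLink ASCII conventions, but the parser itself is a hypothetical example with made-up sample values; in practice EYE-EEG imports these events for you.

```python
def parse_saccades(lines):
    """Collect end-of-saccade (ESACC) records from EyeLink ASC text lines."""
    saccades = []
    for line in lines:
        fields = line.split()
        # ESACC <eye> <start> <end> <dur> <sx> <sy> <ex> <ey> <ampl> <peakvel>
        if fields and fields[0] == "ESACC":
            saccades.append({
                "eye": fields[1],             # "L" or "R"
                "start_ms": int(fields[2]),   # saccade onset (tracker time)
                "end_ms": int(fields[3]),     # saccade offset
                "duration_ms": int(fields[4]),
                "amplitude_deg": float(fields[9]),
            })
    return saccades

# Two made-up lines in EyeLink ASC style
demo = [
    "SSACC R 3041812",
    "ESACC R 3041812 3041850 38 512.1 386.0 98.2 401.7 8.04 311",
]
print(parse_saccades(demo))
```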
The EEG recording contains additional trigger values besides the "3".
Download example dataset here [14.4 MB]