Extracting eye-data at 1000 Hz

OS: Win10
PsychoPy version: 2021.2.3
Standard Standalone Installation: yes
Do you want it to also run online: no

What are you trying to achieve?:

Hi there, I’m doing an Eyetracking experiment with an Eyelink 1000 and iohub. The settings are set in a .yaml file (see below) that I run in a code component (begin Experiment tab). After each trial, I’d like to extract the data of the entire trial duration, at 1000Hz (to compute a saccadic latency and adapt the task across trials).

What did you try to make it work?:

At first, I had a code component with the following:

begin routine:

thisTrialSamples = []

each frame:

thisTrialSamples += [eyetracker.getPosition()]

And this worked well, until I realised that ‘Each Frame’ actually runs at 60 Hz (the screen’s FPS), so instead of accumulating eye data at 1000 Hz, I had it at 60 Hz.

I tried other things like ‘eyetracker.getEvents()’, but it only returns 1024 items while I was expecting 3000 (I am extracting the data over 3 s of recording).
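For illustration: iohub keeps incoming events in a fixed-length buffer (the ‘event_buffer_length’ device setting, commonly 1024), which would explain the 1024-item cap exactly, since 1024 samples is about 1 s at 1000 Hz. Here is a plain-Python sketch of the two read strategies; the deque stands in for the event buffer, and the buffer size and drop behaviour are assumptions about iohub, not facts taken from this thread:

```python
from collections import deque

# Hypothetical model (not iohub itself): the event buffer as a fixed-length
# ring buffer, like iohub's event_buffer_length setting (commonly 1024).
BUFFER_LEN = 1024
TOTAL_SAMPLES = 3000            # 3 s of recording at 1000 Hz

# Strategy A: read the buffer once, at the end of the trial.
buffer_a = deque(maxlen=BUFFER_LEN)
for sample in range(TOTAL_SAMPLES):
    buffer_a.append(sample)     # older samples are silently dropped
end_of_trial_read = list(buffer_a)
print(len(end_of_trial_read))   # 1024 -- only the most recent ~1 s survives

# Strategy B: drain the buffer on every screen refresh and accumulate.
# (~17 samples arrive per 60 Hz frame; 20 per read keeps the numbers round.)
buffer_b = deque(maxlen=BUFFER_LEN)
collected = []
for read in range(150):
    for i in range(20):
        buffer_b.append(read * 20 + i)
    collected += buffer_b       # take everything currently buffered...
    buffer_b.clear()            # ...and clear it before the next frame
print(len(collected))           # 3000 -- nothing lost
```

In Builder terms, strategy B would correspond to putting ‘thisTrialSamples += eyetracker.getEvents()’ in the ‘Each Frame’ tab: the code still runs at 60 Hz, but each call returns every sample buffered since the previous frame, so the buffer never overflows between reads. (That getEvents() clears the returned events by default is an assumption worth checking against the iohub documentation.)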

Link to the most relevant existing thread you have found:

This looks similar to “getLastGazePosition() only gives new gaze position every 15 ms - #2 by Corios”, in which the solution was to switch from iohub to Pylink, but I’d like to stay with iohub (and preferably with the Builder).

This one also had a solution, but using the Coder: “How do you track pupil size live on iohub?”

What specifically went wrong when you tried that?:

The ‘Each Frame’ code worked well, but it is just not suited to my need for a 1000 Hz sampling rate.

The eyetracker.getEvents() call returned samples of type ‘MonocularEyeSampleEventNT’, containing all the data needed (time, x, y), but only for about 1 s of recording (I checked the timestamps) instead of 3.
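Once full-rate samples are in hand, the saccadic-latency step mentioned in the opening post could look roughly like this. Everything here is a sketch: ‘Sample’ is a hypothetical stand-in for the time/x/y fields of MonocularEyeSampleEventNT, and the 30 units/s velocity threshold is an arbitrary example value, not a recommended criterion:

```python
import math
from collections import namedtuple

# Hypothetical stand-in for the time/x/y fields of an iohub eye sample event.
Sample = namedtuple("Sample", ["time", "x", "y"])

def saccade_latency(samples, target_onset, velocity_threshold=30.0):
    """Return the latency (s) of the first sample after target_onset whose
    point-to-point velocity exceeds velocity_threshold (units/s), or None."""
    for prev, cur in zip(samples, samples[1:]):
        if cur.time <= target_onset:
            continue
        dt = cur.time - prev.time
        if dt <= 0:
            continue                 # skip duplicate/garbled timestamps
        velocity = math.hypot(cur.x - prev.x, cur.y - prev.y) / dt
        if velocity > velocity_threshold:
            return cur.time - target_onset
    return None

# Synthetic 1000 Hz trace: 200 ms of fixation, then a fast position ramp.
samples = [Sample(t / 1000.0, 0.0, 0.0) for t in range(200)]
samples += [Sample((200 + t) / 1000.0, t * 0.5, 0.0) for t in range(100)]
print(saccade_latency(samples, target_onset=0.05))
```

A point-to-point velocity criterion like this is the simplest possible detector; published saccade-detection algorithms typically also smooth the velocity trace and require the threshold to be exceeded for a minimum duration.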

The YAML file I am using:

data_store:
    enable: true
    experiment_info:
        code: ystart
    session_info:
        code: S0001
monitor_devices:
    - Display:
        default_eye_distance:
            surface_center: 570
            unit_type: mm
        device_number: 0
        name: display
        physical_dimensions:
            height: 340
            unit_type: mm
            width: 590
        psychopy_monitor_name: default
        reporting_unit_type: pix
    - Keyboard:
        name: keyboard
    - Mouse:
        name: mouse
    - Experiment:
        name: experimentRuntime
    - eyetracker.hw.sr_research.eyelink.EyeTracker:
        calibration:
            auto_pace: true
            pacing_speed: 1.5
            screen_background_color:
                - 128
                - 128
                - 128
            target_attributes:
                inner_color:
                    - 0
                    - 0
                    - 0
                inner_diameter: 0
                outer_color:
                    - 0
                    - 0
                    - 0
                outer_diameter: 30
                target_type: CIRCLE_TARGET
            type: FIVE_POINTS
        default_native_data_file_name: 0031639
        device_timer:
            interval: 0.001
        enable_interface_without_connection: false
        model_name: EYELINK 1000 TOWER
        monitor_event_types:
            - MonocularEyeSampleEvent
            - BinocularEyeSampleEvent
            - FixationStartEvent
            - FixationEndEvent
            - SaccadeStartEvent
            - SaccadeEndEvent
            - BlinkStartEvent
            - BlinkEndEvent
        name: tracker
        network_settings: 100.1.1.1
        runtime_settings:
            sample_filtering:
                FILTER_ONLINE: FILTER_OFF
            sampling_rate: 1000
            track_eyes: RIGHT_EYE
        vog_settings:
            pupil_center_algorithm: CENTROID_FIT
            pupil_measure_types: PUPIL_AREA
            tracking_mode: PUPIL_CR_TRACKING
        save_events: true
        simulation_mode: false
        stream_events: true

Hello @Sylvain_Gerin

Your eye tracker is capable of sampling at 1000 Hz. However, your monitor only has a frame rate of 60 Hz. In your experiment, you have two devices: (1) your monitor, which runs at 60 Hz, and (2) your eye tracker, which runs at 1000 Hz. So the monitor image can change every 16.667 ms, while your eye tracker can register eye movements every 1 ms, independent of the monitor frame rate. However, code in the ‘Each Frame’ tab is executed at your monitor’s frame rate, not at the sampling rate of the eye tracker. Thus, you register the eye-tracking data only every 16.667 ms.
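The arithmetic behind this is worth making explicit (plain Python, nothing PsychoPy-specific):

```python
refresh_hz = 60      # monitor frame rate
sampling_hz = 1000   # eye tracker sampling rate
trial_s = 3          # trial duration in seconds

frame_interval_ms = 1000 / refresh_hz     # time between 'Each Frame' calls
sample_interval_ms = 1000 / sampling_hz   # time between eye tracker samples

print(round(frame_interval_ms, 3))  # 16.667 ms per frame
print(sample_interval_ms)           # 1.0 ms per sample
print(refresh_hz * trial_s)         # 180 getPosition() reads per 3 s trial
print(sampling_hz * trial_s)        # 3000 samples the tracker actually records
```

So reading one position per frame discards roughly 94% of the samples the tracker produces over the trial.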

The eye tracker samples at 1000 Hz in part to prevent aliasing. Aliasing occurs when a continuous signal is sampled at a frequency too low to represent the original signal accurately. According to the Nyquist theorem, one should sample at at least twice the maximum frequency present in the signal.
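A worked example of that folding effect (generic signal arithmetic; the 400 Hz figure is just an illustration, not a claim about eye-movement spectra):

```python
def apparent_frequency(f_signal, f_sample):
    """Frequency (Hz) that f_signal appears to have when sampled at f_sample,
    folded into the 0..f_sample/2 band (the classic aliasing formula)."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

# A 400 Hz component sampled at only 60 Hz masquerades as a slow 20 Hz signal:
print(apparent_frequency(400, 60))    # 20
# Sampled at 1000 Hz it is preserved (400 Hz < Nyquist limit of 500 Hz):
print(apparent_frequency(400, 1000))  # 400
```

This is why a fast event like a saccade, whose velocity profile contains high-frequency content, is distorted when reconstructed from 60 Hz samples.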

So, if you want to use code in the ‘Each Frame’ tab, you need a faster monitor. However, 1000 Hz monitors are hardly available. You can find some recommendations for monitors in Behavior Research Methods. You can certainly get consumer monitors running at 500 Hz. Bear in mind that stimulus colour and refresh rate seem to interact.

Best wishes Jens


To expand on what @JensBoelte said.

Eye trackers like this (which have a high sampling rate; arguably eye trackers in general) are typically operated separately from your experiment program, meaning that the actual recording of data is done by another program. You could handle it all in PsychoPy, but that really depends on your coding ability.

If you look at how you are supposed to run EyeLink with PsychoPy (Communicating with an Eyetracker — PsychoPy v2025.2.2):

You actually need a second computer that runs the EyeLink software, which is then connected to PsychoPy. PsychoPy sends a signal to the other computer to start tracking (whenever you set it to), and the non-PsychoPy computer handles the recording.

Also, when getting monitors with higher refresh rates (like 500 or 1000 Hz), you need to ensure that the monitor latency (how quickly the computer can send a new frame to the monitor) and response time (how long it takes a pixel to change colour) can actually support that rate.
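As a rough illustration of the budget involved (simple arithmetic; the actual per-frame costs vary by monitor and GPU):

```python
def frame_budget_ms(refresh_hz):
    """Total time available per frame at a given refresh rate; rendering,
    transmission latency, and pixel response must all fit inside it."""
    return 1000.0 / refresh_hz

print(frame_budget_ms(60))    # ~16.7 ms -- plenty of slack for most panels
print(frame_budget_ms(500))   # 2.0 ms   -- pixel response alone can exceed this
print(frame_budget_ms(1000))  # 1.0 ms
```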

Issac
