Sending trigger to EGI system: Ethernet latency

Hello everyone,

I’m currently using PsychoPy to transmit triggers (events) to the EGI system (Net Amp 400) via an Ethernet cable (Cat 5e), following the company’s recommendations. I have two questions regarding this setup. First, there are moments when PsychoPy fails to display the stimulus properly, causing a delay, and consequently the trigger is never sent to the EGI system and is lost. Why does this happen? Second, I’ve noticed that the timing data received from PsychoPy consistently exhibits some degree of delay, which varies inconsistently (usually within a few milliseconds). Is this variability normal? I would appreciate any insights you may have on this matter.

Best regards,
Mohammadreza

Hello,

First, there are moments when PsychoPy fails to display the stimulus properly, causing a delay, and consequently the trigger is never sent to the EGI system

Assuming you’re using the code from the main docs page (Sending triggers via EGI NetStation — PsychoPy v2024.1.1), if a stimulus isn’t displayed then the condition for sending the trigger is never met: if stimulus.status == STARTED and not triggerSent
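As a sketch of that logic: the trigger only fires once the stimulus actually reaches STARTED status, so a stimulus that never draws means a trigger that never sends. StubClient and StubStimulus below are stand-ins so this runs without an amplifier; a real experiment would use egi-pynetstation’s client and PsychoPy stimulus objects.

```python
# Minimal sketch of the per-frame trigger pattern from the docs page.
# StubClient/StubStimulus are hypothetical stand-ins, not the real API.

STARTED = "started"  # stand-in for psychopy.constants.STARTED

class StubClient:
    """Pretend EGI client that records the events it was asked to send."""
    def __init__(self):
        self.sent = []
    def send_event(self, event_type):
        self.sent.append(event_type)

class StubStimulus:
    def __init__(self):
        self.status = None  # not yet drawn

def run_frames(stimulus, client, n_frames, draw_on_frame):
    """Simulate an Each-Frame loop: send the trigger exactly once,
    and only after the stimulus reaches STARTED."""
    trigger_sent = False
    for frame in range(n_frames):
        if frame == draw_on_frame:
            stimulus.status = STARTED  # set when drawing succeeds
        if stimulus.status == STARTED and not trigger_sent:
            client.send_event("stm+")
            trigger_sent = True
    return trigger_sent

# Stimulus never drawn within the loop -> the "lost trigger" case:
client = StubClient()
print(run_frames(StubStimulus(), client, n_frames=10, draw_on_frame=99),
      client.sent)   # False []

# Stimulus drawn on frame 3 -> trigger sent exactly once:
client2 = StubClient()
print(run_frames(StubStimulus(), client2, n_frames=10, draw_on_frame=3),
      client2.sent)  # True ['stm+']
```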

Second, I’ve noticed that the timing data received from PsychoPy consistently exhibits some degree of delay, which varies inconsistently (usually within a few milliseconds). Is this variability normal?

The “delay” or offset is caused by the difference between when PsychoPy draws the stimulus and when it sends the command; that delay should be fairly consistent. Using the example code above, the deviations around that offset should be (mostly) within 3-5 ms.

If you’re noticing deviations larger than this, or a linear drift over the course of your experiment, then there are some things to check:

  1. Are you using the most recent copy of the package (GitHub - nimh-sfim/egi-pynetstation: Python package to enable the use of a Python-based API for EGI's NetStation EEG amplifier interface), which is also installable via pip?
  2. If you’re measuring the offset with the EGI timing kit, make sure you’re getting as clear a signal as possible, using either the gold-plug audio cable or the photodiode with the display brightness turned up.
  3. Consider adding an occasional resync() command. Using the example page above, you would add an eci_client.resync() call to the start of a routine. This instructs egi-pynetstation to query the amplifier for the current time and calculate a behind-the-scenes difference that is then used to compute stimulus onset times. *I’ll add this instruction to the official docs page next chance I get.
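Conceptually, what resync() accomplishes can be sketched in plain Python: the client asks the amplifier for its current clock time, stores the difference from the local clock, and applies that offset when stamping onsets. The helpers below (compute_offset, stamp_onset) and the example clock readings are illustrative, not part of the egi-pynetstation API.

```python
# Simplified sketch of clock resynchronization between a local (PsychoPy)
# clock and the amplifier clock. All names and values are hypothetical.

def compute_offset(local_time, amp_time):
    """Offset to add to local timestamps to express them on the amp clock."""
    return amp_time - local_time

def stamp_onset(local_onset, offset):
    """Convert a locally measured stimulus onset to amplifier time."""
    return local_onset + offset

# Suppose the amp clock reads 5000.120 s when the local clock reads 5000.100 s:
offset = compute_offset(5000.100, 5000.120)
print(round(offset, 3))                        # 0.02
print(round(stamp_onset(123.456, offset), 3))  # 123.476
```

If the two clocks drift apart over a long session, re-measuring this offset periodically (which is what calling resync() at the start of a routine does) keeps the computed onset times from drifting with them.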

Feel free to post your code and/or the output from a timing test (MFF file preferred, but can make do with a screenshot of the Timing Tool output).

-Peter

For illustration, here is a task I recently collected. The offset is the difference between the photocell and the event trigger. The variability around that offset should be small: you can see here that 95% of offsets are within 1 ms of the median and 98% are within 2 ms. There’s one outlier offset.

When analyzing these data, I would give that median offset to Net Station, MNE, FieldTrip, etc. to recalculate the onset times.
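That post-hoc correction is just a few lines: take the median of the measured photocell-vs-trigger offsets (the median is robust to the odd outlier) and shift the trigger onsets by it. The offset and onset values below are made-up examples, not data from this recording.

```python
# Sketch of the median-offset correction. Values are hypothetical (seconds).
from statistics import median

# Measured photocell-minus-trigger offsets; the last one is an outlier.
offsets = [0.0210, 0.0212, 0.0209, 0.0211, 0.0215, 0.0211, 0.0350]
med = median(offsets)  # median ignores the single outlier
print(round(med, 4))   # 0.0211

# Shift the recorded trigger onsets onto the photocell (true display) times:
trigger_onsets = [1.200, 2.400, 3.600]
corrected = [round(t + med, 4) for t in trigger_onsets]
print(corrected)       # [1.2211, 2.4211, 3.6211]
```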

Hi Peter,
Thank you for your response,

Allow me to clarify my concern. I’m currently working on an ERP task where each event is expected to occur at a 1.2-second interval after the preceding one. However, upon receiving the events in the EGI system, I noticed variance in the timing of event arrivals. As depicted in Figure 1, the recorded inter-event intervals have a mean of 1.2048 seconds and a standard deviation of 0.0909 seconds.

Furthermore, I have included the structure of my recording setup (without the response pad console and video projector) (reference: https://www.egi.com/images/stories/manuals/GES_400/core/GES_400MR_uman_8100401-55_20150722_hires.pdf).

Thank you.
MohammadReza
(Figure 1: avg_egi)

Hello,

There could be a lot of contributing factors. My first feeling is that the refresh rate of the screen is influencing the mean.

What is the refresh rate of your monitor and projector? Are you collecting the photocell data from the projector? You could look at the photocell output to see whether PsychoPy is causing the variance in presentation.
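To illustrate why refresh rate could shift the mean (assuming, purely for illustration, a 60 Hz display): stimulus onsets can only happen at frame flips, so a requested interval gets rounded up to the next ~16.7 ms frame boundary, and any dropped frame adds a full period.

```python
# Hypothetical illustration of interval quantization on a 60 Hz display.
import math

REFRESH_HZ = 60
FRAME = 1.0 / REFRESH_HZ  # ~16.667 ms per frame

def quantize(requested_interval):
    """Round a requested interval up to the next frame boundary."""
    return math.ceil(requested_interval / FRAME - 1e-9) * FRAME

print(round(quantize(1.200), 4))  # 1.2 (exactly 72 frames at 60 Hz)
print(round(quantize(1.205), 4))  # 1.2167 (forced onto the 73rd frame)
```

On this assumed display, 1.2 s is exactly 72 frames, so any slop in when the flip is requested pushes the interval to 73 frames (1.2167 s) rather than adding a fraction of a millisecond; a mean slightly above 1.2 s with multi-millisecond SD is consistent with that.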

Happy to look at the PsychoPy script and MFF if you post them.

-PM