Showing the stimulus presentation on the eye-tracker's Record screen

Hi everyone,

I am using an EyeLink 1000 Plus with PsychoPy 1.84.2. In the EyeLink demo (Track, i.e., the one that presumably uses EyeLink's own software), the stimuli are shown on the eye-tracker's monitor, so it is possible to see where the subject is looking during the trial. I was wondering whether a similar presentation is possible with iohub.

Has anyone tried doing this?

Thanks for your comments,
Cheers,
Natalia

Hi,

I don’t have any experience with EyeLink, but isn’t the eye-tracker's Record screen just a normal screen connected to your computer? Are you having issues using it with PsychoPy?

It would help to know what you’ve tried so far.

Best,
Jan

Hi,

Thanks for the questions; I will try to explain my idea.

The screen I am talking about is not one connected to my presentation computer, but the one connected to the eye-tracker's data acquisition (Host) computer. I can use iohub to get gaze positions from that computer, and I can send triggers/messages to be stored together with the eye-tracking data. However, I was wondering how (or rather whether) I can send the screen that the subjects see to the data acquisition machine and display it there during recording (at the moment I only see a grey screen and the subject's gaze on the data acquisition computer while recording). I know there is (must be) a way to do this using the EyeLink Experiment Builder software.
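
For reference, this is roughly what I do at the moment (a minimal, untested sketch; the config keys and method names are how I understand the iohub docs, so please double-check them against your iohub version):

```python
from psychopy.iohub import launchHubServer

# Minimal iohub setup for an EyeLink (assumed config keys -- check against
# your iohub version / yaml config file)
iohub_config = {'eyetracker.hw.sr_research.eyelink.EyeTracker':
                {'name': 'tracker'}}
io = launchHubServer(**iohub_config)
tracker = io.devices.tracker

tracker.runSetupProcedure()       # camera setup / calibration on the Host PC
tracker.setRecordingState(True)   # start recording

# Messages ("triggers") stored together with the eye-tracking data
tracker.sendMessage('TRIALID 1')

# Latest gaze position, available on the presentation computer only;
# nothing of the stimulus is shown on the Host PC at this point
gpos = tracker.getLastGazePosition()
if isinstance(gpos, (tuple, list)):   # a valid sample is available
    print('gaze at', gpos)

tracker.setRecordingState(False)
io.quit()
```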

PS: a second option could be to create a second window (on a second monitor) on my presentation computer and duplicate whatever the subjects see there, plus a gaze cursor. For the time being I am favoring the first way (if it is possible to do it that way).
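
A rough, untested sketch of what I mean by that second option, assuming the same iohub setup as above (the gaze coordinates would need rescaling if the mirror window is not the same size as the subject's screen):

```python
from psychopy import visual, core
from psychopy.iohub import launchHubServer

io = launchHubServer(**{'eyetracker.hw.sr_research.eyelink.EyeTracker':
                        {'name': 'tracker'}})
tracker = io.devices.tracker
tracker.setRecordingState(True)

# Subject's window (screen 0) and experimenter's mirror window (screen 1);
# waitBlanking=False on the mirror so it does not slow down the subject's display
subj_win = visual.Window((1920, 1080), screen=0, fullscr=True, units='pix')
mirror_win = visual.Window((1920, 1080), screen=1, fullscr=False, units='pix',
                           waitBlanking=False)

# A PsychoPy stimulus is bound to one window, so make one copy per window
subj_stim = visual.TextStim(subj_win, text='+')
mirror_stim = visual.TextStim(mirror_win, text='+')
gaze_dot = visual.Circle(mirror_win, radius=10, fillColor='red', lineColor='red')

clock = core.Clock()
while clock.getTime() < 5.0:
    subj_stim.draw()
    subj_win.flip()

    mirror_stim.draw()
    gpos = tracker.getLastGazePosition()
    if isinstance(gpos, (tuple, list)):
        # gaze comes back in display (pixel) coordinates; rescale here if the
        # mirror window has a different size than the subject's screen
        gaze_dot.pos = gpos
        gaze_dot.draw()
    mirror_win.flip()

tracker.setRecordingState(False)
io.quit()
```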

Thanks again,
N

This is really a question about the EyeLink API. You should consult the manuals for that to see how/if this can be done. This sort of thing is eye tracker dependent (e.g. I could tell you how to do it for an SMI system, but not EyeLink). Come back to us with what you find, if necessary.

Hi Michael,

Thanks for this comment and clarification!

Hi Natalia,

I was wondering if you were able to figure out how to get PsychoPy to draw the same stimulus on the host PC (data acquisition computer) as the subject sees on the experiment computer? If so, can you share the relevant lines of code or point me towards the right resources? I, unfortunately, haven’t found much support online.

Thanks so much!

Best,
Anuya

Hi Anuya,

I ended up not implementing that part, as it was likely to introduce a synchronization delay (and to be of little use) for the experiment I was working on at the time.
You could have a look at bitmapBackdrop() and bitmapSaveAndBackdrop() from pylink, and dig further into the EyeLink API, as Michael suggested.
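I never tried it myself, but from my reading of the pylink documentation and the SR Research examples, the idea would be roughly the sketch below: save a screenshot of the PsychoPy window and transfer it to the Host PC as a backdrop with imageBackdrop(); bitmapBackdrop()/bitmapSaveAndBackdrop() do the same with raw pixel data instead of a saved file. Please verify the exact function names, argument order, and constants (e.g. BX_MAXCONTRAST) against your pylink version.

```python
import pylink
from psychopy import visual

SCN_W, SCN_H = 1920, 1080

# Connect directly to the Host PC (default EyeLink Host IP)
el_tracker = pylink.EyeLink('100.1.1.1')

win = visual.Window((SCN_W, SCN_H), fullscr=True, units='pix')
stim = visual.TextStim(win, text='+')
stim.draw()
win.flip()

# Save what the subject currently sees to an image file
win.getMovieFrame()                  # grab the front buffer after the flip
win.saveMovieFrames('backdrop.png')

# Clear the Host PC display, then transfer the screenshot as its backdrop,
# so the experimenter sees the stimulus with the gaze cursor drawn on top
el_tracker.sendCommand('clear_screen 0')
el_tracker.imageBackdrop('backdrop.png',
                         0, 0, SCN_W, SCN_H,   # crop region of the source image
                         0, 0,                 # top-left position on the Host display
                         pylink.BX_MAXCONTRAST)

# ... start recording, run the trial, etc. (the transfer is slow, so do it
# before recording starts, not on every frame)
```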

Best,
Natalia