Eye-Tracking and Gaze-Contingent Paradigms

Hi everyone,

Does anyone have experience with coding gaze-contingent paradigms in PsychoPy?

In my particular case, I’m interested in the moving window paradigm, in which the characters of a sentence or text are crossed out except for a few “closest” to wherever the reader is gazing; the text is thus unreadable except within a moving window that follows the reader’s gaze.

I’m working with an SMI eye-tracker, which I’ve successfully gotten working with a couple of simple image-viewing experiments (using iView’s SDK rather than ioHub). The SMI SDK actually contains an example of a gaze-contingent experiment, but it’s rather simple compared to the moving window paradigm, which I suspect has a number of technical and theoretical issues of its own.

Does anyone have any ideas or suggestions on where to start?

Thanks a lot!

In PsychoPy (outside Builder), the moving window task should be very easy to implement by using the aperture stimulus http://www.psychopy.org/api/visual/aperture.html

e.g. you create two images (or text stimuli): one with the actual text, the other with the mask (e.g. lots of XXXs). On every frame, you draw the XXX mask, position the aperture at the current gaze position, and then draw the text stimulus through it, so that the real text replaces the underlying mask just at that location.
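Something like the following minimal sketch shows that draw order (the sentence, the 120 px aperture size, and the monospaced font are just placeholder choices; a monospaced font simply makes the Xs line up with the real characters):

```python
from psychopy import visual

# the Aperture stimulus needs a stencil buffer
win = visual.Window([1024, 768], units='pix', color='grey', allowStencil=True)

sentence = 'The quick brown fox jumps over the lazy dog.'
# real text and an X-for-X mask, drawn at the same position in the same font
text = visual.TextStim(win, text=sentence, font='Courier New',
                       color='black', height=24)
mask = visual.TextStim(win, text='X' * len(sentence), font='Courier New',
                       color='black', height=24)

aperture = visual.Aperture(win, size=120, shape='circle')  # the moving window

# one frame of the paradigm (this would sit inside your frame loop):
aperture.enabled = False
mask.draw()              # the masked sentence, drawn everywhere
aperture.pos = (0, 0)    # later: the current mouse or gaze position
aperture.enabled = True
text.draw()              # the real text shows only inside the aperture
win.flip()
```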

Perhaps get this working first using the mouse to control the aperture position, to check that it behaves the way you want, and then switch to making the position gaze controlled. Once the ioHub connection is made, achieving gaze control is just one line of code (setting the stimulus position to the current gaze position); it’s getting the connection configured that can be the tricky part.
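Continuing the sketch above, a mouse-controlled version of the frame loop might look like this (the escape key is just a convenient way out of the loop):

```python
from psychopy import event, core

mouse = event.Mouse(win=win)

while not event.getKeys(['escape']):
    aperture.enabled = False
    mask.draw()                     # full masked sentence
    aperture.pos = mouse.getPos()   # swap in the gaze position here later
    aperture.enabled = True
    text.draw()                     # real text visible only at the aperture
    win.flip()

win.close()
core.quit()
```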

That sounds, at the very least, like an excellent starting point. I can imagine a few issues that might come up with having gaze directly control the aperture position but I’ll think about those if/when they pop up (and I can successfully implement it). I’ll start messing around with the aperture stimulus, as you suggested, and see what I can do.

Thanks a lot!

Once you have the mouse version working, making it work with the eye-tracker should be relatively straightforward. I’m not sure how much info you want about how ioHub connects to eye-trackers, but I gave a pretty complete rundown of an SMI setup I used in this thread: SMI Eyetracker iViewXAPi

Once you have the ioHub server communicating successfully with the eye-tracker, you just need gpos = tracker.getLastGazePosition(), which will give you the most recent gaze position as an [x, y] pair (or nothing, if it doesn’t have a lock on the eyes). That should cleanly replace the x, y coordinates you are using in the mouse-based version.
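As a rough sketch only (the launch arguments and device name depend entirely on how your iohub_config.yaml is set up; see the thread above for the actual SMI configuration I used):

```python
from psychopy.iohub import launchHubServer

# assumes a yaml config that defines the SMI tracker under the device label 'tracker'
io = launchHubServer(iohub_config_name='iohub_config.yaml')
tracker = io.devices.tracker
tracker.setRecordingState(True)

# inside the frame loop, replacing mouse.getPos():
gpos = tracker.getLastGazePosition()
if gpos:                  # nothing is returned when the tracker loses the eyes
    aperture.pos = gpos   # [x, y] in the window's coordinate units
```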


Thanks for your input, Jonathan! I had actually looked through that thread when I first started trying to connect the eye-tracker to PsychoPy.

I actually managed to get it working, both with the mouse and with eye-tracking! I didn’t use ioHub, though, but rather the code included in iView’s SDK. The one problem I encountered was with the movement of the window: since the eye is never really at rest in one place but is constantly making small, rapid movements, using the gaze position directly to position the window resulted in a very jittery, rapidly shaking window that was rather disturbing. I got around it by not using the gaze coordinates directly, but instead using the gaze position to determine which character is being looked at and placing the window at the center of that character. The window therefore only moves when the participant’s gaze moves between characters, not while they’re fixating a single character.
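Roughly, the snapping looks like this (simplified: it assumes a monospaced font, a single line of text centered at x = 0, and char_w / line_y holding the character width and the line’s vertical position in the same units as the gaze data):

```python
def snap_to_character_center(gaze_x, n_chars, char_w):
    """Return the x coordinate of the center of the character under the gaze."""
    left_edge = -n_chars * char_w / 2.0
    idx = int((gaze_x - left_edge) // char_w)   # which character is being looked at
    idx = max(0, min(n_chars - 1, idx))         # clamp to the ends of the line
    return left_edge + (idx + 0.5) * char_w

# in the frame loop, instead of the raw gaze coordinates:
# window_pos = (snap_to_character_center(gaze_x, len(sentence), char_w), line_y)
```

This way the window position only changes when the gaze crosses a character boundary, which gets rid of most of the jitter.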

The correspondence between gaze and window still feels a bit off, but I’m still working on that portion of the code. Hopefully I’ll figure it out next week!