Use image and video as eye-tracker calibration target

Happy new year, everyone!

My lab focuses on developmental research with infants and young children. To achieve good eye-tracker calibration, we often use videos or images as calibration targets, which attract young participants’ attention. Although this feature is widely supported on other platforms (e.g., PTB on MATLAB), PsychoPy does not yet support it. For this reason, we are hesitant to fully switch to PsychoPy.

We hope this feature can be added in a future release. To my knowledge, using an image or video for calibration is natively supported in the SDKs from EyeLink and Tobii, so it shouldn’t be too complicated to incorporate into iohub.

Thanks in advance!

Does this code help?

Also, I’ve written a Moving Cue demo (Animation figure - #11 by wakecarter): a pulsating circle which I could easily change to an image or animation.
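For reference, the pulsation in that demo boils down to modulating the target’s size with a sine of elapsed time; here is a minimal pure-Python sketch (the function name `pulse_size` and its parameters are my own, not from the demo):

```python
import math

def pulse_size(t, base=30.0, amplitude=10.0, period=1.0):
    """Return a target radius (in pixels) that pulses sinusoidally.

    t: elapsed time in seconds; the radius oscillates between
    base - amplitude and base + amplitude once per `period` seconds.
    """
    return base + amplitude * math.sin(2 * math.pi * t / period)
```

On each frame you would assign the returned value to the stimulus size (e.g. a PsychoPy `Circle`’s `radius`) before drawing, using the window’s clock for `t`.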


But the issue is not how to create a calibration animation; it is how to communicate the animation to the eye tracker from within PsychoPy. If PsychoPy can’t support this, we will have to write our own scripts against the APIs provided by the eye-tracker vendors.

And thank you for bringing up the Tobii controller. Unfortunately, the eye tracker we are using is an EyeLink, and I haven’t seen a similar package available for it.



Sorry, I’m not an expert at eye tracking equipment.

Is it possible to add an image with the same frame-by-frame coordinates as the fixation dot?

It is currently not possible to use images or videos as the calibration target stim when using the psychopy.iohub common eye-tracking interface. It is a good idea though, so we will add it to the list of possible future feature requests.

To use images or videos as your EyeLink calibration stim, I think you will need to use the pylink package directly and implement your own calibration graphics. The SR Research support site may have some examples of doing this.
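The pylink route means subclassing `pylink.EyeLinkCustomDisplay` and overriding its drawing callbacks. Below is a rough sketch, not a complete implementation: the class name, and the `win` / `target_stim` objects (assumed to be a PsychoPy `Window` and an `ImageStim` or `MovieStim` created by the caller) are my own; a stub base class is used so the sketch can be read without the EyeLink Developer’s Kit installed. A real subclass must also implement the remaining `EyeLinkCustomDisplay` methods (beeps, key input, camera image drawing, etc.).

```python
try:
    import pylink  # SR Research's Python wrapper, from the EyeLink Developer's Kit
    _Base = pylink.EyeLinkCustomDisplay
except ImportError:
    # Stub so this sketch is importable without the SDK; remove in real use.
    class _Base(object):
        pass

class ImageCalibrationGraphics(_Base):
    """Draw an image (or current movie frame) as the calibration target."""

    def __init__(self, win, target_stim):
        self.win = win                  # assumed PsychoPy Window
        self.target_stim = target_stim  # assumed ImageStim / MovieStim

    def setup_cal_display(self):
        self.win.flip()  # clear to background before calibration starts

    def clear_cal_display(self):
        self.win.flip()

    def erase_cal_target(self):
        self.win.flip()

    def draw_cal_target(self, x, y):
        # pylink passes pixel coordinates with (0, 0) at the top-left;
        # PsychoPy's 'pix' units are centre-origin, so convert first.
        w, h = self.win.size
        self.target_stim.pos = (x - w / 2, h / 2 - y)
        self.target_stim.draw()
        self.win.flip()
```

The custom graphics would then be installed with `pylink.openGraphicsEx(ImageCalibrationGraphics(win, stim))` before calling the tracker’s setup/calibration routine.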

Thank you!
