
Eye-tracking development for Pavlovia

Hi all,

I was wondering whether there are any plans to develop eye-tracking for online experiments.

There is existing open-source web-based eye-tracking JavaScript code, and I was wondering whether it would be straightforward to integrate it with PsychoJS.



Hi Julia,

I happen to be playing around a bit with webgazer, and looking around on the forum, I saw your message. I wouldn’t call it straightforward. For context: I’m quite an experienced programmer. It took me about two hours to get a shabby prototype working, and it will easily take me a day or more to have something nice. On the upside, I like to document things carefully, so once I’ve got something, it could be a lot easier for you to adopt. Don’t wait for it though: the moment I need to do higher-priority stuff for the PsychoJS team, I’ll put the webgazer stuff on hold :).

On the Facebook group PsychMAP, I’ve been chatting with someone about webgazer. I’ve quoted some info below that could be useful.

What I noticed: (1) you need to keep your head quite still; (2) it was very heavy on the CPU of my laptop (even though mine is quite powerful); and (3) accuracy was good enough for something like an eye-tracking VPT, but not much more. Reliability is the main issue here: many of your participants won’t have sufficient discipline or sufficiently powerful equipment, so your data will be very noisy.
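To give an idea of how you could tame some of that noise after the fact, here is a minimal sketch of a moving-average filter over logged gaze samples. The window size and the `{x, y}` sample format are my own assumptions for illustration, not part of webgazer’s API:

```javascript
// Sketch: smooth noisy webcam gaze samples with a simple moving average.
// Each sample is assumed to be an object {x, y} in screen coordinates.
function smoothGaze(samples, windowSize = 5) {
  return samples.map((_, i) => {
    // average over the current sample and up to (windowSize - 1) before it
    const start = Math.max(0, i - windowSize + 1);
    const win = samples.slice(start, i + 1);
    const mean = key => win.reduce((sum, p) => sum + p[key], 0) / win.length;
    return { x: mean('x'), y: mean('y') };
  });
}
```

A filter like this trades a little latency for stability, which is usually a good deal given how jittery webcam-based estimates are.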

I looked at what Labvanced and Gorilla use. Labvanced is open-source, so I could establish that they also use the webgazer library. Gorilla is closed-source (and their eye-tracking is in closed beta), so I cannot establish what tech they use; however, judging from their reference documentation, it really looks like webgazer too. This means that on the level of technology, they are the same. Where they could distinguish themselves is in how good their calibration tasks are (i.e. look here, now look there). Once I’ve got something ready, I’ll try to build it in such a way that you have a lot of control over calibration. That does mean it will be a bit more work to set up, but given how iffy eye-tracking via webcam is, that’s probably worth it.
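As an illustration of that kind of control, here is a sketch of a function generating an evenly spaced grid of calibration targets. I’m assuming PsychoJS-style “height” units (y from −0.5 to 0.5, x scaled by the aspect ratio); the grid size and margin are illustrative choices, not the demo’s actual values:

```javascript
// Sketch: generate a cols x rows grid of calibration targets in
// normalized "height" units, keeping a margin from the screen edges.
function calibrationGrid(cols, rows, aspect = 16 / 9, margin = 0.1) {
  const points = [];
  const halfW = aspect / 2 - margin; // usable half-width in height units
  const halfH = 0.5 - margin;       // usable half-height
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      points.push({
        x: cols > 1 ? -halfW + (2 * halfW * c) / (cols - 1) : 0,
        y: rows > 1 ? -halfH + (2 * halfH * r) / (rows - 1) : 0,
      });
    }
  }
  return points;
}
```

Generating the points in code (rather than hard-coding them) makes it easy to add density where accuracy matters, e.g. near your stimuli.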

Best, Thomas


Hi again! Here is a prototype of eye tracking via webgazer.


Hi Thomas,

Thank you very much, that’s fabulous! I will have a look and pass it on to interested colleagues.

Also, thanks for linking some additional info, which will definitely be helpful for data collection. It comes as no surprise that data reliability may become an issue and that processing demands might mean that not all recruited participants will be able to complete a study. I had trouble in the past with a behavioural experiment without eye-tracking where the experiment froze or lagged for quite a few of my participants.



You’re welcome! My Twitter post about this experiment also spawned some nice suggestions. I’ll add them to the README file. Update: done :)


Hi Thomas! Thank you so much for creating this. I was wondering if it would be possible to track the number of blinks per minute with this prototype.

Yes, I think so. Basically, this prototype demonstrates how to integrate an existing eye-tracking library (webgazer) into PsychoJS; if webgazer can do it, then you can likely expand the prototype to do it.
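As a rough sketch of the counting side (separate from detecting the blinks themselves): suppose, hypothetically, that your gaze log contains `null` entries while the eyes are closed. Then blinks per minute could be estimated by counting runs of nulls. The `minGap` threshold and the null convention are my assumptions, not webgazer behavior:

```javascript
// Sketch: estimate blinks per minute from a gaze-sample log, assuming
// (hypothetically) that samples are null while the eyes are closed.
// A run of at least minGap consecutive nulls counts as one blink.
function blinksPerMinute(samples, durationMs, minGap = 2) {
  let blinks = 0;
  let run = 0;
  for (const s of samples) {
    if (s === null) {
      run++;
    } else {
      if (run >= minGap) blinks++;
      run = 0;
    }
  }
  if (run >= minGap) blinks++; // run reaching the end of the log
  return blinks / (durationMs / 60000);
}
```

The `minGap` threshold filters out single dropped frames so they are not miscounted as blinks.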

To find out about blink detection, I did a search on the issues of webgazer’s GitHub repo. Here are the results:

Much appreciated, Thomas! I remember getting Webgazer.js to initiate within half an hour earlier this summer, but today I couldn’t get anything working even after two hours of tweaking.

Your calibration protocol also seems fairly effective. I tried going through it three times; after the best run, the black square stayed consistently within an inch of my gaze target, with what seemed like less than 200 ms of latency. Seems pretty good to me!

Thanks for your kind words pbog!

Challenge to implement
I can well imagine this was challenging for you to implement, since it was quite a challenge for me as well :). I don’t mention it in the README, but my main blocker was a conflict between the implementations of the “seedrandom” libraries used by webgazer and by PsychoJS. See the code component in the loading_trial routine for my workaround.
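To illustrate the shape of that kind of workaround (this is a sketch of the general idea, not the actual code in the loading_trial routine): when two libraries both patch `Math.random`, you can save a reference to the implementation you want to keep before the second library loads, then restore it afterwards:

```javascript
// Sketch: protect against a library clobbering Math.random.
// Keep a reference to the implementation the rest of the code relies on.
const originalRandom = Math.random;

// ...imagine a library (or its bundled seedrandom) overriding it here...
Math.random = () => 0.42; // stand-in for the conflicting override

// Restore the saved implementation so subsequent code behaves as expected.
Math.random = originalRandom;
```

The same save-and-restore pattern works for any global that two bundled libraries fight over.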

Nice! My bet is that calibration near the edges of the screen could be improved a bit. Just add some more trials to calibration_trials.xlsx for that.

Hi Thomas,

I was wondering if there are any updates on the prototype version you made a month ago? I just had a chance to look at your demo, but somehow I got stuck in the calibration phase and could not continue the experiment. At some point in the calibration, no more white squares appear, just a gray background with the webcam recording in the top left. I’d appreciate any advice or updates you can provide!


Hi Young,

No updates, I’m afraid. I work as an RSE and built the eye-tracking experiment just as a tech demo. I’m hoping that a researcher who will actually use it in an experiment will take it to the next level.

About your issue: here are two possibilities. It could be that one of the white squares is drawn outside of the area your monitor can display, or it could be that something is going wrong after calibration. Try adjusting the conditions file; only present a single square in the center, for example. If you then get to the actual eye tracking, it’s the first explanation; if not, it’s probably something more complicated.

Best, Thomas

I had this issue when trying out this demo a couple of days ago (very cool, Thomas).

The issue is that some of the calibration points can be drawn behind the thumbnail of the webcam in the top-left corner, so you can’t see them to click on them. This depends on the size of the screen: it is an issue for me on my laptop screen, but not on a much larger monitor.

To get it working on my laptop, I just deleted the offending coordinates in the calibration_trials.xlsx file (i.e. the ones that had x coordinates of -0.66 or -0.44 and positive y coordinates of 0.22 or 0.44). That is quite a crude solution and will reduce the calibration accuracy of course, but I couldn’t figure out how to get the points to draw on top of the thumbnail.
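For anyone wanting to apply the same workaround in code rather than by hand-editing the spreadsheet, the idea could be sketched as a filter over the point list. The thumbnail bounds below are an assumption based on the coordinates mentioned above, not measured values:

```javascript
// Sketch: drop calibration points that would sit behind the webcam
// thumbnail in the top-left corner. Bounds are assumed, not measured:
// occluded means x <= xMax (far left) AND y >= yMin (near the top).
function removeOccluded(points, bounds = { xMax: -0.4, yMin: 0.2 }) {
  return points.filter(p => !(p.x <= bounds.xMax && p.y >= bounds.yMin));
}
```

As noted above, deleting points is crude and costs calibration accuracy in that corner; it is only a stopgap.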

Hi again @Michael and @young1,

I’ve now heard about the issue of a calibration square being occluded by the webcam thumbnail quite a few times. I’ll release a fix for it. This could take a little while, but not more than a week or two, since I’ll bundle it up with related development work.

This is what I’d like to achieve:

  • Must-have. Prevent presenting calibration squares behind the webcam thumbnail. Ideally, move them a bit to the right or down so that they stay as close to the top-left corner as possible; otherwise, simply don’t present them.
  • Could-have. Adjust the positioning of the calibration squares such that they are more evenly distributed over screens of different aspect ratios.
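The must-have above could be sketched roughly like this: instead of dropping an occluded point, nudge it just past the thumbnail’s right edge so it stays as close to the top-left corner as possible. The region bounds and the nudge distance are illustrative assumptions, not the values the fix will use:

```javascript
// Sketch: reposition a calibration point out of an occluded top-left
// region instead of deleting it. Bounds and offset are assumed values.
function nudgeOutOfThumbnail(p, region = { xMax: -0.4, yMin: 0.2 }) {
  if (p.x > region.xMax || p.y < region.yMin) return p; // not occluded
  // move just past the thumbnail's right edge, keeping y unchanged
  return { x: region.xMax + 0.05, y: p.y };
}
```

Compared with deleting points, this keeps coverage of the corner region, at the cost of a slightly uneven grid.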

Best, Thomas

Little update on “prevent presenting calibration squares behind webcam thumbnail”: I changed my mind about how to achieve this. I’d like to hide the webcam thumbnail by default, only showing it when the participant’s eyes are outside of the validation area. The validation area is the square you see within the thumbnail that turns red if you move your head too far.
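The display logic for that idea is simple enough to sketch. It assumes some way of asking webgazer whether the eyes are inside the validation box (which the stock API does not expose as-is):

```javascript
// Sketch: decide the CSS display value for the webcam thumbnail, given
// whether the participant's eyes are inside the validation box.
// Hidden while tracking is fine; shown as a cue to re-center the head.
function thumbnailDisplay(eyesInValidationBox) {
  return eyesInValidationBox ? 'none' : 'block';
}
```

In the experiment, the return value would be applied to the thumbnail element’s `style.display` on each frame.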

I asked the webgazer team how I could best achieve this in this ticket.


Hi all!
I am currently working on a plugin to integrate webgazer.js into jsPsych. It works well so far, but sometimes the tab in Pavlovia crashes. Has anyone noticed similar behavior?
Best, Tabea


Hey @TabeaW, I haven’t noticed this in my PsychoJS integrations, but I did notice something else: on Windows 10 Firefox, I can get a warning of an unresponsive script. Do you get any messages in the browser console?

Update about eye-tracking upgrades for PsychoJS (which you’re welcome to use in jsPsych, @TabeaW):

  • Got some useful pointers from the webgazer team.
  • Managed to build a modified version of webgazer 2.0.1 that exposes the checkEyesInValidationBox function. Seems to work.
  • My CPU is a bit busy today running a big simulation. Once it’s done, I’ll tweak the PsychoJS prototype some more.

Unfortunately no; when the tab crashes, it clears the console, and shortly before the crash I couldn’t spot any warnings or errors.

Tricky. If I don’t get any useful error messages, I tend to debug with an almost experimental approach: try different libraries and browsers, and remove code until something works. @TabeaW, does my prototype also crash your browser? Here is a link to the runner:


I managed to implement the features I was aiming for. An improved version of the eye-tracking demo can be found at the link below.