Eye-tracking development for Pavlovia

Hello all,

I have added some functionality to Thomas’s demo in this experiment:
https://run.pavlovia.org/saketh/pavlovia-eye-tracking-heatmaps/
It draws heatmaps with simpleheat.js, using the gaze positions reported by webgazer.

You can check out the code/repo here: https://gitlab.pavlovia.org/saketh/pavlovia-eye-tracking-heatmaps
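For anyone curious how the pieces fit together, here is a rough sketch (the helper name is mine, not from the repo) of collecting webgazer samples into the `[x, y, intensity]` triplets that simpleheat.js accepts via its `data()` method:

```javascript
// Hypothetical helper: buffer gaze samples as [x, y, intensity] points for
// simpleheat.js, keeping only the most recent maxPoints samples.
function makeGazeCollector(maxPoints) {
  const points = [];
  return {
    add(x, y) {
      // simpleheat points are [x, y, intensity]; a fixed intensity of 1 is fine,
      // since density comes from many overlapping points.
      points.push([x, y, 1]);
      if (points.length > maxPoints) points.shift(); // keep memory bounded
      return points.length;
    },
    data() {
      return points;
    },
  };
}

// In the browser you would wire it up roughly like this (not runnable outside
// a page that loads webgazer and simpleheat, and assumes a <canvas id="heatmapCanvas">):
//   const collector = makeGazeCollector(2000);
//   window.webgazer.setGazeListener((gaze) => {
//     if (gaze) collector.add(gaze.x, gaze.y);
//   });
//   const heat = simpleheat('heatmapCanvas');
//   setInterval(() => heat.data(collector.data()).draw(), 100);
```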

Thanks!


Saketh.


Hi @thomas_pronk,

I was wondering if there was a specific reason you chose version 2.01 instead of the latest version (2.10) of webgazer? Coincidentally, I saw on the jsPsych website that they also don’t support webgazer 2.10. It looks like version 2.10 includes some TensorFlow components, which are supposed to increase accuracy, I think. But does this make it difficult to work with PsychoPy? I’m happy to try to incorporate version 2.10 into PsychoPy myself, but I wonder if you foresee any roadblocks.

Thanks,
Han

Hi! No reason, it’s just that during my PsychoPy career it wasn’t out yet. Since only the minor version number went up, I assume the API is the same, so it should work out of the box. It might need a little tweaking for that feature I added in my webgazer fork, though I think the team was planning to add this functionality anyway. Hope it goes smoothly!


Hi There,

To anyone following this thread: the version of @thomas_pronk’s demo that is maintained by the team has now been updated to work in PsychoPy version 2021.2.3. There were some changes between versions 2021.1.4 and 2021.2 which may mean the original version will not work in later releases (downloadResources changed to prepareResources).

You can find the updated demo here: demos / demo_eye_tracking2 · GitLab
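For reference, here is a minimal sketch of handling that rename defensively. The wrapper name and the resource-descriptor shape are my own assumptions; only the prepareResources/downloadResources method names come from the release change described above:

```javascript
// Hypothetical wrapper: PsychoJS >= 2021.2 renamed downloadResources to
// prepareResources on the server manager. This forwards to whichever method
// the running PsychoJS version provides.
function loadExtraResources(serverManager, resources) {
  // `resources` is assumed to be an array like:
  //   [{ name: 'image.png', path: 'resources/image.png' }]
  if (typeof serverManager.prepareResources === 'function') {
    return serverManager.prepareResources(resources); // 2021.2+ path
  }
  if (typeof serverManager.downloadResources === 'function') {
    return serverManager.downloadResources(resources); // pre-2021.2 fallback
  }
  throw new Error('No resource-loading method found on this serverManager');
}
```

In an experiment you would call it with `psychoJS.serverManager`; checking for the method at runtime avoids a hard crash when the script runs under an older or newer PsychoJS than it was written for.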

Becca


Hi @thomas_pronk,

In your demo you remove the mouse listeners for the tracking trial with:
window.webgazer.removeMouseEventListeners();

However, if an experiment is click-based (fixation crosses, items, etc.), would keeping the mouse listeners help with accuracy over time? In other words, do you think there’s a problem with keeping the listeners “on” throughout the experiment?

Thanks!

Hey! I remove them so that later clicks aren’t used to calibrate the eye tracker anymore. Regardless of that, you can keep using the mouse in PsychoJS to register clicks.

I think you need to be careful about keeping the mouse event listeners because webgazer seems to assume that the user always follows the cursor and uses cursor movement to generate eye predictions. See this discussion: Updating regression on mouse move should be optional · Issue #39 · brownhci/WebGazer · GitHub
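To make the pattern concrete, here is a small defensive sketch (the function name is mine; only removeMouseEventListeners itself appears in the demo) of detaching webgazer’s self-calibration handlers once calibration is done:

```javascript
// Hypothetical helper: detach webgazer's own mouse handlers after calibration,
// so clicks and cursor movement no longer feed the regression model.
// PsychoJS mouse components have their own listeners and keep working.
function stopClickCalibration(webgazerInstance) {
  if (webgazerInstance && typeof webgazerInstance.removeMouseEventListeners === 'function') {
    webgazerInstance.removeMouseEventListeners();
    return true; // listeners were removed
  }
  return false; // nothing to do (wrong object, or method missing)
}

// In the demo this would be called as: stopClickCalibration(window.webgazer);
```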


Thanks! Another question if you don’t mind:

I noticed your demo uses averagingWindow to smooth out the motion of the tracking square by taking an average of the last n gazes. If I’m not interested in averaging, could I just set averagingWindow to 1? Or should I directly save the raw gazes using window.xGazes and window.yGazes?

In brief, I’m looking for the best way to save gaze predictions during a task and what specific variable to use for that.

Thanks!

I guess it should just work with an averagingWindow of 1, but I never tried it out. I’d recommend just directly saving the raw gazes.
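To make the n = 1 case concrete, here is a toy reimplementation (mine, not the demo’s actual code) of averaging the last n gaze samples:

```javascript
// Sketch of what averagingWindow does conceptually: report the mean of the
// last n gaze samples from the x/y histories (as kept in window.xGazes and
// window.yGazes in the demo).
function averageLastN(xGazes, yGazes, n) {
  const k = Math.max(1, Math.min(n, xGazes.length)); // clamp window to data
  let sx = 0;
  let sy = 0;
  for (let i = xGazes.length - k; i < xGazes.length; i++) {
    sx += xGazes[i];
    sy += yGazes[i];
  }
  return { x: sx / k, y: sy / k };
}
```

With n = 1 this reduces to the most recent raw sample, so setting averagingWindow to 1 and saving the raw gazes directly should give the same numbers; saving the raw gazes just skips the extra bookkeeping.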

Hi @Becca,

There were some changes between version 2021.1.4 and 2021.2 which may mean the original version will not work in later releases (downloadResources changed to prepareResources)

I have found that downloadResources does not work anymore.
However, prepareResources does not work either.

Do you know any solution?

Thanks in advance,
Minho

How are you using prepareResources? What error do you get when you try to use it? Could you share the code?

If you are looking to use it for eye tracking, I’d suggest looking at the code in this post. If you are looking to use it for other reasons, here’s a demo I made recently that I hope might help.

Hi @Becca,

My question was about resource loading at arbitrary moment, and your demo really helped me a lot.

I really appreciate your contribution!!!

Many thanks,
Minho


Hi, I’m starting a new experiment and was wondering if there have been any new developments with webgazer or the demos for version 2022.2.2. I was able to use the demos successfully last year, but just want to make sure I’m not missing anything when starting up again with the newest PsychoPy version. Thanks!

Hey - no, I don’t think you’re missing anything! Here are the links to the demos we use in workshops: Advanced online — Workshops for PsychoPy 2022


Hi, I am a PsychoPy newbie, so I’m sorry if this is a dumb question: is there any way to run eye-direction detection and SSVEP simultaneously? I need to know whether a user’s eye direction confirms the approximate location of an SSVEP target (among other SSVEP objects) on the monitor.

Thanks! I noticed in the demo (version 3) that if your head leaves the validation box during calibration, the webcam thumbnail reappears, and putting your head back in the validation box does not make it disappear again (which blocks potential calibration points). I know @thomas_pronk dealt with a similar issue in version 2, but I was wondering if you knew how to prevent this from happening during calibration.
I see version 3 has some additional lines in the calibration_code (22-28) that version 2 doesn’t have, but I haven’t been able to resolve this myself. Any thoughts?

Thanks in advance!