Eye-tracking development for Pavlovia

Tricky. If I don’t get any useful error messages, I tend to debug using an almost experimental design; try different libraries, browsers, remove code, until I get something to work. @TabeaW, does my prototype also crash your browser? Here is a link to the runner: https://run.pavlovia.org/tpronk/demo_eye_tracking/

I managed to implement the features I was aiming for. An improved version of the eye-tracking demo can be found at the link below.

I have not noticed it in your prototype; this seems to work well. But the demo calibration of WebGazer led to the same behavior. I opened an issue for WebGazer, as it is not a Pavlovia issue.

I see they’re already helping you out there. Good luck getting it ironed out!

This is seriously one of the biggest advancements in online data collecting I have seen in the last months. Thank you so much Thomas!

Cross-linking a thread with questions about my prototype. Off-screen/on-screen gaze with WebGazer-based experiment on PsychoPy/Pavlovia

A little update on “prevent presenting calibration squares behind the webcam thumbnail”. I changed my mind about how to achieve this: I’d like to hide the webcam thumbnail by default, only showing it when the participant’s eyes are outside of the validation area. The validation area is the square you see within the thumbnail that turns red if you move your head too far.
I asked the WebGazer team how I could best achieve this in this ticket
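The idea above could be sketched roughly as follows. The `webgazer.showVideoPreview()` call is part of the real WebGazer API; the validation-area check and how the current eye box is read are assumptions on my part, labeled as such in the comments.

```javascript
// Returns true when the detected eye box lies fully inside the validation
// area. Both rectangles are plain objects: {left, top, right, bottom}.
function eyesInsideBox(eyes, area) {
  return eyes.left >= area.left && eyes.top >= area.top &&
         eyes.right <= area.right && eyes.bottom <= area.bottom;
}

// Browser-side usage sketch (assumes WebGazer is already loaded and running;
// currentEyeBox() is a hypothetical helper that reads the tracked eye
// rectangle from WebGazer's face tracker):
//
// const area = {left: 100, top: 60, right: 220, bottom: 180};
// setInterval(() => {
//   const eyes = currentEyeBox();
//   // Show the thumbnail only when the head has drifted out of position.
//   webgazer.showVideoPreview(!eyesInsideBox(eyes, area));
// }, 200);
```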

In the updated demo, the calibration squares appearing in the upper left still seem to be unclickable. Have you been able to work around this? Otherwise, I may just need the entire video feedback to disappear. Thoughts?

Did you try version 2 (also in this thread)? That one hides the webcam thumbnail when the participant’s head is correctly positioned, thus working around this issue. It also places the calibration squares near the edges of the screen regardless of the screen’s aspect ratio, which should make calibration slightly better and more reliable.

Oh sorry, you even mention that you use the updated version. I’ll take a look on Monday.

Much appreciated. I’m still quite wet behind the ears with coding properly. I’m messing around with the calibration trials and looking into the coding equivalent of “send to back” for the video thumbnail. I’ll keep messing around with it this weekend.

Found my mistake: for a workshop I once enabled the “face overlay”, but I forgot to disable it again. Because this overlay is drawn on top of the calibration square, you can’t click the square. I updated the demo. In your version, in routine webcam_trial, code component start_webgazer, line 4, replace…

    window.webgazer.params.showFaceOverlay = true;

by…

    window.webgazer.params.showFaceOverlay = false;
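For reference, a minimal sketch of the WebGazer display flags in play here. Only `showFaceOverlay` appears in the demo code quoted above; `showVideoPreview` (the webcam thumbnail) and `showFaceFeedbackBox` (the validation square) are, to my understanding, the sibling params controlling the rest of the video feedback, so treat the grouping below as an assumption rather than the demo’s actual code.

```javascript
// Hypothetical helper: collect WebGazer's display flags in one place, so a
// single "debug" switch turns all visual feedback on or off together.
function displayParams(debug) {
  return {
    showVideoPreview: debug,    // the webcam thumbnail itself
    showFaceOverlay: debug,     // mesh drawn over the face (this blocked the clicks)
    showFaceFeedbackBox: debug  // green/red head-position validation square
  };
}

// In an experiment these would be applied one by one, e.g.:
// Object.assign(window.webgazer.params, displayParams(false));
```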

Hi @thomas_pronk, just to keep useful resources in one place, I was just pointed to this alternative JavaScript eye tracking library and it seems to work very well in their online demo.

Here is some sort of comparison against WebGazer:

Thanks for the tip @Michael!

I took a look and I’m not sure about GazeCloudAPI. I’ve got two worries:

  1. Webgazer performs the eye-tracking on your computer, but GazeCloud sends the webcam video to their server for processing. That’s quite privacy-sensitive info. They don’t have a privacy policy on their website, nor can I look up who owns their domain name: https://www.whois.com/whois/gazerecorder.com
  2. There are not a lot of customization options.

In my searching around, I did find that one of the developers of GazeRecorder forked a repo for face detection. That one seems quite interesting (more than 11k stars). https://github.com/justadudewhohacks/face-api.js

Best, Thomas

Thanks, I hadn’t looked deeply into GazeCloud, other than trying out their (impressive) demo. Local processing will be an important consideration for many studies (and something that ethics committees would/should consider).

@thomas_pronk
Hello,
I have tested the demo. However, there is no output of eye-tracking in the result file. Was there a problem? Can you show me which index shows the result of eye movement? I am looking forward to your reply.

Hi @Hsin-Yuan_Chen,

Yes, the demo does not log anything. The gaze coordinates, converted to PsychoJS height coordinates, can be found in the tracking_trial routine, tracking_code component, lines 24 to 27:

    [
      x - psychoJS.window.size[0] / 2,
      -1 * (y - psychoJS.window.size[1] / 2)
    ]
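Wrapped up as a standalone function, the conversion above looks like this: WebGazer reports gaze in browser pixels (origin at the top-left, y pointing down), while PsychoJS centers the origin and has y pointing up. The function name and parameters are my own for illustration, not identifiers from the demo.

```javascript
// Convert a gaze position in browser pixels (origin top-left, y down) to
// window-centered coordinates (origin at the center, y up), mirroring the
// demo snippet quoted above. winWidth/winHeight are the window size in
// pixels, i.e. psychoJS.window.size[0] and [1] in the demo.
function gazeToCentered(x, y, winWidth, winHeight) {
  return [
    x - winWidth / 2,          // shift the origin to the window center
    -1 * (y - winHeight / 2)   // flip y so that up is positive
  ];
}

// Example: the top-left corner of a 1920x1080 window
// gazeToCentered(0, 0, 1920, 1080) → [-960, 540]
```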

Best, Thomas

Excellent. It seems to be working perfectly now. Calibration certainly could use some finetuning but it’s quite accurate for my purposes. Thanks so much!

Check out: https://medium.com/@williamwang15/integrating-gazecloudapi-a-high-accuracy-webcam-based-eye-tracking-solution-into-your-own-web-app-2d8513bb9865

Thanks for the tip @szydej! See my critique of GazeCloud in an earlier post in this thread. Eye-tracking development for Pavlovia

You can also try https://github.com/szydej/GazeFlowAPI. It uses local processing.