Eye-tracking development for Pavlovia

I did not see this in your prototype; that seems to work well. But the demo calibration of WebGazer led to the same behavior. I opened an issue for WebGazer, since it is not a Pavlovia issue.


I see they're already helping you out there. Good luck getting it ironed out!


This is seriously one of the biggest advancements in online data collection I have seen in recent months. Thank you so much, Thomas!


Cross-linking a thread with questions about my prototype. Off-screen/on-screen gaze with WebGazer-based experiment on PsychoPy/Pavlovia

A little update on "prevent presenting calibration squares behind the webcam thumbnail". I changed my mind about how to achieve this. I'd like to hide the webcam thumbnail by default, only showing it when the participant's eyes are outside of the validation area. The validation area is the square you see within the thumbnail that turns red if you move your head too far.
I asked the WebGazer team how I could best achieve this in this ticket.
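For what it's worth, the visibility rule described above boils down to a point-in-rectangle check. Here is a minimal sketch; all names are hypothetical, and the real validation area and eye position would come from WebGazer's face feedback, not from these plain objects:

```javascript
// Hypothetical sketch: show the webcam thumbnail only when the
// participant's eyes fall outside the validation area.
// box is {left, top, right, bottom} in video pixels; eyes is {x, y}.
function shouldShowThumbnail(eyes, box) {
  const insideValidationArea =
    eyes.x >= box.left && eyes.x <= box.right &&
    eyes.y >= box.top && eyes.y <= box.bottom;
  return !insideValidationArea;
}

// Eyes centered in a 100x100 validation box: thumbnail stays hidden.
console.log(shouldShowThumbnail({ x: 50, y: 50 }, { left: 0, top: 0, right: 100, bottom: 100 })); // → false
```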

The updated demo still seems to produce unclickable calibration squares in the upper left. Have you been able to work around this? Otherwise, I may just need the entire video feedback to disappear. Thoughts?

Did you try version 2 (also in this thread)? That one hides the webcam thumbnail when the participant's head is correctly positioned, which works around this issue. It also places the calibration squares near the edges of the screen, regardless of the screen's aspect ratio, which should make calibration slightly better and more reliable.

Oh sorry, you even mention that you use the updated version. I'll take a look on Monday.

Much appreciated. I'm still quite wet behind the ears with coding properly. I'm messing around with the calibration trials and looking into the coding equivalent of "send to back" for the video thumbnail. I'll keep messing around with it this weekend.

Found my mistake; for a workshop I once enabled the "face overlay", but I forgot to disable it again. Because this overlay is on top of the calibration square, you can't click it. I updated the demo. In your version, routine webcam_trial, code component start_webgazer, line 4, replace…
window.webgazer.params.showFaceOverlay = true;
by…
window.webgazer.params.showFaceOverlay = false;

Hi @thomas_pronk, just to keep useful resources in one place: I was just pointed to this alternative JavaScript eye-tracking library, and it seems to work very well in their online demo.

Here is a rough comparison with WebGazer:


Thanks for the tip @Michael!

I took a look and I'm not sure about GazeCloudAPI. I've got two worries:

  1. Webgazer performs the eye-tracking on your computer, but GazeCloud sends the webcam video to their server for processing. That's quite privacy-sensitive info. They don't have a privacy policy on their website, nor can I look up who owns their domain name: https://www.whois.com/whois/gazerecorder.com
  2. There are not a lot of customization options.

In my searching around, I did find that one of the developers of GazeRecorder forked a repo for face detection. That one seems quite interesting (more than 11k stars). https://github.com/justadudewhohacks/face-api.js

Best, Thomas


Thanks, I hadn't looked deeply into GazeCloud, other than trying out their (impressive) demo. Local processing will be an important consideration for many studies (and something that ethics committees would/should consider).


@thomas_pronk
Hello,
I have tested the demo. However, there is no eye-tracking output in the result file. Was there a problem? Can you show me where the eye-movement results appear? I am looking forward to your reply.

Hi @Hsin-Yuan_Chen,

Yes, the demo does not log anything. The gaze coordinates, converted to PsychoJS height coordinates, can be found in the tracking_trial routine, tracking_code component, lines 24 to 27:

    [
      x - psychoJS.window.size[0] / 2,
      -1 * (y - psychoJS.window.size[1] / 2)
    ]
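As a self-contained sketch of the centering step shown above (the function name and window-size argument are mine, not from the demo; any further rescaling to height units is not shown here):

```javascript
// Convert a WebGazer gaze estimate (pixels, origin top-left, y pointing down)
// to window-centred coordinates (origin at the centre, y pointing up).
// windowSize plays the role of psychoJS.window.size: [widthPx, heightPx].
function gazeToPsychoJS(x, y, windowSize) {
  return [
    x - windowSize[0] / 2,
    -1 * (y - windowSize[1] / 2)
  ];
}

// The top-left corner of a 1920x1080 window maps to [-960, 540]:
console.log(gazeToPsychoJS(0, 0, [1920, 1080])); // → [ -960, 540 ]
```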

Best, Thomas


Excellent. It seems to be working perfectly now. The calibration could certainly use some fine-tuning, but it's quite accurate for my purposes. Thanks so much!


Check out: https://medium.com/@williamwang15/integrating-gazecloudapi-a-high-accuracy-webcam-based-eye-tracking-solution-into-your-own-web-app-2d8513bb9865

Thanks for the tip @szydej! See my critique of GazeCloud in an earlier post in this thread: Eye-tracking development for Pavlovia

You can also try https://github.com/szydej/GazeFlowAPI. It uses local processing.

I just noticed I'm chatting with a developer of GazeFlowAPI. Cool 🙂

So in this repo I see I need an AppKey for it to work. Peeking into the C# and HTML5 JavaScript folders, I don't see any code that actually processes the video, but I do see sockets being set up for connecting to 127.0.0.1. Is there another repo that does the processing, then?

With GazeFlowAPI you can access real-time gaze and head-position data from the GazePointer webcam eye-tracker.

How to use it:

  1. Install and start GazePointer (download: https://sourceforge.net/projects/gazepointer/)
  2. To get your AppKey, register at https://gazeflow.epizy.com/GazeFlowAPI/register/ (you can use the default AppKey for testing).
  3. Connect to GazePointer and start receiving gaze data.