I have not mentioned it in your prototype. This seems to work well. But the demo calibration of webgazer led to the same behavior. I opened an issue for the webgazer as it is not a pavlovia issue.
I see they're already helping you out there. Good luck getting it ironed out!
This is seriously one of the biggest advancements in online data collection I have seen in recent months. Thank you so much Thomas!
Cross-linking a thread with questions about my prototype. Off-screen/on-screen gaze with WebGazer-based experiment on PsychoPy/Pavlovia
Little update on "prevent presenting calibration squares behind webcam thumbnail". I changed my mind about how to achieve this. I'd like to hide the webcam thumbnail by default, only showing it when the participant's eyes are outside of the validation area. The validation area is the square you see within the thumbnail that turns red if you move your head too far.
I asked the webgazer team how I could best achieve this in this ticket
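A minimal sketch of that show-the-thumbnail-only-when-needed logic. The decision itself is a pure function; the polling approach and the `checkEyesInValidationBox()` call are my assumptions about how to query WebGazer's validation-box state, not a confirmed API, so double-check them against the WebGazer source:

```javascript
// Decide whether the webcam thumbnail should be visible:
// show it only while the participant's eyes are OUTSIDE the validation area.
function shouldShowThumbnail(eyesInValidationBox) {
  return !eyesInValidationBox;
}

// Hypothetical wiring into WebGazer (names below are assumptions):
// setInterval(() => {
//   const inBox = window.webgazer.checkEyesInValidationBox?.() === true;
//   window.webgazer.params.showVideo = shouldShowThumbnail(inBox);
// }, 200);
```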
The updated demo still seems to result in calibration squares appearing in the upper left being unclickable. Have you been able to work around this? Otherwise, I may just need the entire video feedback to disappear. Thoughts?
Did you try version 2 (also in this thread)? That one hides the webcam thumbnail when the participantās head is correctly positioned, thus working around this issue. Also has the calibration squares near the edges of the screen, regardless of the screenās aspect ratio, which should make a slightly better and more reliable calibration.
Oh, sorry, you even mention you use the updated version. I'll take a look on Monday.
Much appreciated. I'm still quite wet behind the ears with coding properly. I'm messing around with the calibration trials and looking into the coding equivalent of "send to back" for the video thumbnail. I'll keep messing around with it this weekend.
Found my mistake: for a workshop I once enabled the "face overlay", but I forgot to disable it again. Because this overlay is on top of the calibration square, you can't click it. I updated the demo. In your version, routine webcam_trial, code component start_webgazer, line 4, replace…
window.webgazer.params.showFaceOverlay = true;
by…
window.webgazer.params.showFaceOverlay = false;
Hi @thomas_pronk, just to keep useful resources in one place, I was just pointed to this alternative JavaScript eye tracking library and it seems to work very well in their online demo.
Here is some sort of comparison against WebGazer:
Thanks for the tip @Michael!
I took a look and I'm not sure about GazeCloudAPI. I've got two worries:
- Webgazer performs the eye-tracking on your computer, but GazeCloud sends the webcam video to their server for processing. That's quite privacy-sensitive info. They don't have a privacy policy on their website, nor can I look up who owns their domain name: https://www.whois.com/whois/gazerecorder.com
- There are not a lot of customization options.
In my searching around, I did find that one of the developers of GazeRecorder forked a repo for face detection. That one seems quite interesting (more than 11k stars). https://github.com/justadudewhohacks/face-api.js
Best, Thomas
Thanks, I hadn't looked deeply into GazeCloud, other than trying out their (impressive) demo. Local processing will be an important consideration for many studies (and something that ethics committees would/should consider).
@thomas_pronk
Hello,
I have tested the demo. However, there is no eye-tracking output in the results file. Was there a problem? Can you show me which index shows the result of eye movement? I am looking forward to your reply.
Hi @Hsin-Yuan_Chen,
Yes, the demo does not log anything. The gaze coordinates converted to PsychoJS height coordinates can be found in the tracking_trial routine, tracking_code component, lines 24 to 27:
[
x - psychoJS.window.size[0] / 2,
-1 * (y - psychoJS.window.size[1] / 2)
]
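For reference, the conversion above moves the origin from the top-left of the window to its center and flips the y-axis so that y increases upward. Wrapped as a stand-alone helper (the function name and the explicit window-size argument are illustrative, not part of the demo itself):

```javascript
// Convert WebGazer gaze coordinates (pixels, origin top-left, y down)
// to centered window coordinates (origin at center, y up).
function gazeToCentered(x, y, windowSize) {
  const [width, height] = windowSize;
  return [
    x - width / 2,
    -1 * (y - height / 2)
  ];
}

// Example: a gaze point right of center and above center in a 1600x900 window.
// gazeToCentered(1000, 200, [1600, 900]) → [200, 250]
```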
Best, Thomas
Excellent. It seems to be working perfectly now. Calibration certainly could use some fine-tuning, but it's quite accurate for my purposes. Thanks so much!
Thanks for the tip @szydej! See my critique of GazeCloud in an earlier post in this thread: Eye-tracking development for Pavlovia
You can also try https://github.com/szydej/GazeFlowAPI. It uses local processing.
I just noticed I'm chatting with a developer of GazeFlowAPI. Cool!
So in this repo I see I need an AppKey for it to work. Peeking into the C# and HTML5 JavaScript folders, I don't see any code that actually processes the video, but I do see sockets being set up for connecting to 127.0.0.1. Is there another repo that does the processing then?
With GazeFlowAPI you can access real-time gaze and head position data from the GazePointer WebCam Eye-Tracker.
How to use it:
- Install and start GazePointer (download: https://sourceforge.net/projects/gazepointer/)
- To get your AppKey, register at https://gazeflow.epizy.com/GazeFlowAPI/register/. You can use the default AppKey for testing.
- Connect to GazePointer and start receiving gaze data.
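A minimal client sketch of the steps above. Everything about the wire protocol here (the port number, the handshake, and the JSON field names) is an assumption for illustration only; check the GazeFlowAPI repo for the actual format:

```javascript
// Parse one gaze message from the local GazePointer socket.
// The field names GazeX/GazeY are an assumption about the wire format.
function parseGazeMessage(raw) {
  const msg = JSON.parse(raw);
  return { x: msg.GazeX, y: msg.GazeY };
}

// Hypothetical browser wiring (port and AppKey handshake are placeholders):
// const ws = new WebSocket("ws://127.0.0.1:43333");
// ws.onopen = () => ws.send("YOUR_APP_KEY");
// ws.onmessage = (event) => {
//   const { x, y } = parseGazeMessage(event.data);
//   console.log("gaze:", x, y);
// };
```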