Eye-tracking development for Pavlovia

This thread about eye tracking in PsychoJS/Pavlovia has received a lot of posts. Here is a little recap:

  • I have examined some libraries for eye-tracking via webcams and selected webgazer as presently the most suitable one. Here is a paper examining how well a relatively old version of webgazer works in three cognitive tasks.
  • The experiment demo_eye_tracking2 illustrates how to use webgazer with PsychoJS. This experiment includes a calibration procedure and a gaze-tracking procedure (a bare-bones sketch of the underlying WebGazer calls follows this list).
  • demo_eye_tracking2 can be freely cloned and modified. Researchers have already adapted the experiment for their own needs, with discussion of one such adaptation in another thread on this forum.
  • An OST colleague has made a 5-step tutorial on how to customize demo_eye_tracking2, which can be found in this tweet.
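
For orientation, the WebGazer lifecycle that the demo wires up boils down to something like the sketch below. This is not the demo verbatim: the demo adds calibration and bookkeeping around these calls, and the exact setter names can differ a little between WebGazer versions.

    // Bare-bones WebGazer lifecycle (a sketch, not the demo's actual code)
    webgazer.showVideoPreview(true);     // show the webcam feed, useful during calibration
    webgazer.showPredictionPoints(true); // draw the predicted gaze point on screen
    webgazer.begin();                    // start the webcam and the prediction model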

Hi, you can also try gathering the data using RealEye.io; it's very easy and requires no coding.
Then you can export what you need as a CSV file and process it any way you like.

Thank you for all your work on this. Do you know of a demo or experiment that has implemented this and logs (x, y) coordinates to an output file? I have been trying to do this, but have not been able to. Thanks.
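
For reference, this is roughly what I have been attempting, with webgazer's setGazeListener feeding PsychoJS's addData; treat it as a sketch rather than working code:

    // Collect one (x, y, t) sample per WebGazer prediction
    let gazeSamples = [];
    webgazer.setGazeListener(function (data, elapsedTime) {
        if (data !== null) {
            gazeSamples.push([data.x, data.y, elapsedTime]);
        }
    });

    // Then, in an "End Routine" code component, write this trial's samples
    // into one cell of the output CSV and reset for the next trial:
    psychoJS.experiment.addData('gaze_xyt', JSON.stringify(gazeSamples));
    gazeSamples = [];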

Do you think this could be used to track eye movements while reading a text? I don't need it to be particularly precise; I just want to be able to tell whether their eyes are moving across the page or if they have stopped reading.


Hi @dpg45,

I think it would be tricky to establish whether the eyes are moving or not, but establishing whether participants are looking at the screen or not should be doable. Actually, another researcher has been developing something like that, which we discussed in another thread.
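
To sketch the kind of check I mean (a rough illustration only; it assumes webgazer's setGazeListener):

    // Rough check: is the latest gaze prediction on the screen at all?
    // data is null when WebGazer has no prediction (e.g., no face found).
    let lookingAtScreen = false;
    webgazer.setGazeListener((data, t) => {
        lookingAtScreen = data !== null
            && data.x >= 0 && data.x <= window.innerWidth
            && data.y >= 0 && data.y <= window.innerHeight;
    });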

Best, Thomas


A webcam-based implementation would be very difficult. Look into the differences between shape-based eye tracking such as this (it is based on Papoutsaki's dissertation, 2016) and corneal-reflection eye tracking (which is what you see in more expensive laboratory or off-the-shelf eye-tracking equipment). Corneal-reflection eye tracking is more accurate. The type of eye tracking being implemented here is shape-based, meaning it identifies a contrast between the cornea and other regions of the face. It has a higher degree of error because it is more susceptible to environmental factors such as lighting and head movements.

WebGazer, which is used here, relies on "implicit" click-based calibration: it keeps recalibrating as you click, basing its predictions on roughly the last ten clicks. If you have the participant engage in a passive task, the predictions will lose accuracy over time (I don't have an exact number). You could include mandatory clicks at the end of each line or something, but then you'd greatly impact reading fluency, both in terms of saccades and fixations.
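
If you did go the mandatory-click route, the wiring itself would be small, since WebGazer's top-level API exposes recordScreenPosition() for feeding it explicit calibration samples. A sketch, with a made-up CSS class for the clickable end-of-line targets:

    // Hypothetical "click the dot at the end of each line" calibration
    document.querySelectorAll('.end-of-line-dot').forEach((dot) => {
        dot.addEventListener('click', (event) => {
            // tell WebGazer the participant was looking here when they clicked
            webgazer.recordScreenPosition(event.clientX, event.clientY, 'click');
        });
    });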

Depending on your research question, this may not be an issue. But for any reading passages that are more than a line, the accuracy will be questionable.


Thank you both so much for your replies.

@ayjayar, do you know how WebGazer compares with GazeRecorder? Is that also shape-based? From the demo, GazeRecorder doesn't seem to use click-based calibration, but I don't know if it will also lose accuracy over time.

With the pandemic, I am trying to find something that can be used remotely. I would not need to analyze the data at the level of saccades. Basically, what I am trying to tell is whether the participant's eyes are moving across the page or staring into space/have stopped moving. Do you think WebGazer, or any other webcam-based tracker, can achieve this?

I am not trying to hijack the thread, so please feel free to message me directly if you'd prefer. I'd greatly appreciate your insight.

@dpg45 I can tell you that I messed with GazeRecorder early on, and it has advantages and disadvantages: mainly, it requires a software download (unlike WebGazer, which can run on any webpage) and is all locally managed, but it also automatically records video of the interaction.

I'd recommend starting a new thread, as this thread is specifically meant to stay on the topic of using eye tracking in Pavlovia, without referencing other specific software. I'll message you more if you wish.


I just realized that there is actually a discussion of other trackers (e.g., GazeCloud) earlier in the thread. My mistake for not reading more carefully!


Thank you for all your work, @thomas_pronk. What would be the best way to "turn off" WebGazer in your demo after you no longer need it, e.g., in subsequent trials?

I think webgazer.end() would be a good one for that. See their API documentation over here: Top Level API · brownhci/WebGazer Wiki · GitHub
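
In a code component that could look like the sketch below; pause() and resume() also exist in the top-level API, should you need tracking again later:

    // Temporarily stop predictions (can be resumed later):
    webgazer.pause();
    // ...or shut WebGazer down for good once eye tracking is over:
    webgazer.end();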


Does webgazer report head position? I've been looking through the webgazer docs and can't tell. It seems like this information would be necessary, but I don't see a way to access it.
Thank you in advance :slight_smile:

Hey @r.ward,

I also took a look and this is what I found:

  1. In the wiki I found the function getPositions(): Tracker API · brownhci/WebGazer Wiki · GitHub
  2. Which I tried out by running demo_eye_tracking2, opening the console, and running: webgazer.getTracker().getPositions()
  3. This returns an array with 468 elements, each of which is an array of 3 elements. Looks like 3D coordinates?
  4. Finally, I see in the library that getPositions() is used to draw the face overlay: WebGazer/index.mjs at 7ff29a32b12048362750d0594ecf8375dcdd22a0 · brownhci/WebGazer · GitHub

So… it's there indeed, but not in a very easy-to-use format :slight_smile: What you could do is post an issue in the webgazer repo to ask for what you need; the team is very approachable. To be sure we give them some good specs to work with, it can be useful to think it through a bit. What kind of position data would you like exactly?
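
In the meantime, here is a rough way to squeeze a head-position proxy out of what is already there. It assumes the 468 triplets are [x, y, z] face-mesh landmarks, which matches what I saw in the console:

    // Crude head-position estimate: the centroid of the face-mesh landmarks
    const positions = webgazer.getTracker().getPositions();
    if (positions && positions.length > 0) {
        let cx = 0, cy = 0, cz = 0;
        for (const [x, y, z] of positions) {
            cx += x; cy += y; cz += z;
        }
        const n = positions.length;
        console.log('face centroid:', cx / n, cy / n, cz / n);
    }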

Best, Thomas

Hello guys,

I am a newbie to PsychoJS. I am a student and I am trying to build a small experiment where I show an image and capture frames using the webcam. I have code in JavaScript where I am already doing this, but I want to integrate it into PsychoJS. I have been trying to do this using the Builder tool but couldn't manage it. Could you please direct me to a relevant document or tutorial? Any help would be highly appreciated.

Thanks,
SPJ

Hey @spj,

Is the capturing related to eye-tracking or is it about capturing video in general?

@thomas_pronk It is for eye-tracking, but the aim of using the webcam is to capture images and send them to a server for processing. I have written simple server-side code in Python. It processes these frames and sends a response to the front end.
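
What I have on the front end looks roughly like the sketch below (plain browser JavaScript, not PsychoJS-specific; the /frame endpoint is just my own server route, so treat the names as placeholders):

    // Grab one frame from a webcam <video> element and POST it as a JPEG
    async function sendFrame(video) {
        const canvas = document.createElement('canvas');
        canvas.width = video.videoWidth;
        canvas.height = video.videoHeight;
        canvas.getContext('2d').drawImage(video, 0, 0);
        const dataUrl = canvas.toDataURL('image/jpeg', 0.8);
        await fetch('/frame', {  // hypothetical endpoint on my server
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ frame: dataUrl }),
        });
    }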

Hi Thomas,

I want to create an eye-tracking experiment, and when I tried to run your demo_eye_tracking2, I got the following error message:

 File "\github\demo_eye_tracking2\demo_eye_tracking2_lastrun.py", line 31, in <module>
    ale
NameError: name 'ale' is not defined
##### Experiment ended. #####

I am new to Python, but I've already installed the alepython module and it does not work. Which module does this line refer to with 'ale'?

Hi Esther, I'm afraid that my demo only works online (PsychoJS). For offline use, we're working on improving the way you can integrate eye-tracking, but I think it won't support webcams (only dedicated hardware like Tobii). I asked around and I'll update this post once I'm sure about that. Update: yes, the offline version will only support dedicated hardware.

Hello Thomas,

I have dabbled with webgazer.js for over a month now, and as far as I understand, the triplet in the array you were mentioning corresponds to the following:

  • x coordinate
  • y coordinate
  • latest eye features/pupil features

    // From webgazer's prediction code:
    return predictions[regModelIndex] === null ? null : {
        'x': predictions[regModelIndex].x,
        'y': predictions[regModelIndex].y,
        'eyeFeatures': latestEyeFeatures
    };
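
If I read the source right, this object is also what comes back from webgazer.getCurrentPrediction(), so pulling a sample on demand would look roughly like this (recent WebGazer versions return a promise here; older ones return the object directly):

    webgazer.getCurrentPrediction().then((pred) => {
        if (pred !== null) {
            console.log('gaze at', pred.x, pred.y);
        }
    });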

I hope this helps.


Saketh.


If the goal is to draw heatmaps, we could simply use the predictions from webgazer.js and an HTML canvas to overlay the tracked points. There's an open-source library that I recently found, called simpleheat.js.
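
The core of that combination is only a few lines. A sketch, assuming a <canvas> element with id "heatmap" overlaid on the page and simpleheat.js already loaded:

    const heat = simpleheat(document.getElementById('heatmap'));
    heat.radius(25, 15); // point radius and blur, in pixels

    webgazer.setGazeListener((data, t) => {
        if (data === null) return;
        heat.add([data.x, data.y, 0.1]); // each gaze sample adds a little intensity
        heat.draw();
    });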

I am new to Pavlovia and haven't tried integrating it into a PsychoPy experiment yet, but there's a demo I built at work hosted here.
The code can be found here.

I saw your in-Pavlovia tracking example and am trying to adapt it to load the simpleheat library and do things inside the Pavlovia ecosystem.

1 Like