No problem here, just wanted to show what I did with your help.

URL of the experiment: Jakob David Rusche / TernusLoop140 · GitLab
Pronk's original: https://run.pavlovia.org/tpronk/demo_eye_tracking2/

Hey PsychoPy-People,

My name is Jakob, and I just wrote my bachelor thesis using the eye-tracking experiment of T. Pronk (2020). I got a lot of help from this forum, so I wanted to show what I accomplished. Without your questions and answers, I would not have been able to get my experiment running. Special thanks to @thomas_pronk for his eye-tracking experiment; without it, I would have been lost in huge amounts of unknown code.
Because I study in Germany, all the texts and instructions are in German. My experiment uses a Ternus display with a visual cue, ISI modifications, and two different styles of Ternus displays to analyse what influences visual attention and perception.

At first, I wrote the basic experiment without the eye-tracking components. Three discs appear, a certain amount of time passes (the ISI, or inter-stimulus interval), and the three discs appear again, shifted to the right. The participant then answers via keyboard whether they perceived group motion (J key) or element motion (F key). After 20 trials, the first of nine small blocks is completed. The nine small blocks are then repeated three times, with a possible break in between.
I got the colors of the discs from a previous experiment, but they were in standard RGB, while PsychoPy uses a signed RGB scale. The two variants describe the same color space, but standard RGB ranges from 0 to 255 and PsychoPy's scale from -1 to 1. The formula to convert between the two is as follows:

x = RGB value (0 to 255), y = PsychoPy value (-1 to 1)
y = x / 127.5 - 1

So, you take each of the three numbers of an RGB code, put it into the formula, and get the converted values.
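As a minimal sketch of that conversion (the function name is mine, not from the original experiment):

```javascript
// Convert an RGB triple (0-255 per channel) to PsychoPy's signed
// -1..1 color scale using y = x / 127.5 - 1.
// Hypothetical helper for illustration, not code from the experiment.
function rgbToPsychoPy(rgb) {
  return rgb.map((channel) => channel / 127.5 - 1);
}

console.log(rgbToPsychoPy([255, 0, 127.5])); // → [ 1, -1, 0 ]
```

You would then pass the converted triple as a stimulus color in the Builder or in code.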
With a visual cue, I wanted to shift attention to a specific place on the canvas. For the cue we had five different conditions in which certain discs were briefly illuminated. By illuminating the first element I wanted to shift perception toward element motion, and by illuminating the second toward group motion. The experiment of M. Stepper, B. Rolke and E. Hein (2020) produced results supporting this assumption. The other conditions were illumination of the third disc, of all discs, and of none. In the future someone should run this experiment on a larger scale; I only had 4 participants. I kept the number that low to check whether I would get useful data at all, because I did not know how the eye-tracking data would look when collected from different computers and different participants.
Enough of these experimental details, let's get back to the coding.

Next, I implemented parts of Pronk's code. At the beginning, between the instructions about the experiment, I put the calibration part, in which 16 squares appear one at a time, evenly distributed across the screen. The participants were instructed to click on them. At the moment of the click, the program saves the gaze point and "learns" how to calculate gaze points from the participant's eye positions. Over time the calibration loses accuracy, which is why I added further calibrations after each of the three big blocks. Pronk suggested adding more squares to the calibration to get more accurate results, but for my purposes 16 was plenty.

// Save the averaged gaze points of the last samples

let x = util.sum(window.xGazes) / window.xGazes.length;
let y = util.sum(window.yGazes) / window.yGazes.length;
// addData expects the column name as a string; shift the origin from the
// top-left corner to the screen centre, flipping y so it points upward
psychoJS.experiment.addData('xGazes', x - psychoJS.window.size[0] / 2);
psychoJS.experiment.addData('yGazes', -1 * (y - psychoJS.window.size[1] / 2));

The next part was a bit trickier. In Pronk’s experiment, he took the gaze points and drew a white square at these coordinates. But instead of drawing, I needed to save the gaze points to my output file.

The calculations in the last two rows make the output data easier to read. The raw gaze coordinates have their origin in the top-left corner, so looking at the middle of a 1920×1080 screen yields (960, 540); subtracting half the window size shifts the origin to the screen centre, and the -1 flips the y-axis to match PsychoPy's coordinate system. Pronk also suggested running more calibrations because accuracy drops over time, so I added a calibration trial after every 9×20 experimental trials to prevent the drop.
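In case it helps others, that coordinate shift can be sketched as a standalone function (the name and explicit window-size parameters are mine, for illustration only):

```javascript
// Convert a gaze point from top-left-origin pixel coordinates (as reported
// by the webcam eye tracker) to centre-origin coordinates with y pointing
// up. Hypothetical helper; not code from the original experiment.
function toCenteredCoords(x, y, winWidth, winHeight) {
  return {
    x: x - winWidth / 2,          // shift origin to the horizontal centre
    y: -1 * (y - winHeight / 2),  // shift and flip so y increases upward
  };
}

// The top-left corner of a 1920x1080 screen becomes (-960, 540),
// and the screen centre (960, 540) becomes (0, 0):
console.log(toCenteredCoords(0, 0, 1920, 1080)); // → { x: -960, y: 540 }
```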

This experiment still has some flaws, but I am happy with the result of my work. I'm not really a programmer myself, but this forum helped me a lot to finish my experiment and my bachelor thesis. Feel free to comment, ask, or discuss.


Thanks Jakob, and well done to you and Thomas both. It’s lovely to see people coming to the forum with success stories! :smile: :muscle:


Amazing to read how successfully you ran your experiment. Well done!


Heya, I just wanted to say “cool!” :). I’m waiting for an opportunity to check out your improvements with the attention they deserve, should manage within a week or so.

Gave it a good read and I think it’s double-cool you used the online eye-tracking data in your actual thesis. If I may ask: did you run stats on the gaze data? Did it work out?

If so, it might be fun to post a pre-print of your bachelor thesis on a pre-print server. Online eye-tracking in the behavioral sciences is still very young, so your results could be very valuable to pioneering researchers!
