Hi all,
I would like some help figuring out which of these (pylink, pygaze, or iohub) would be best for my code.
I already have a script for a task similar to Posner's, written using Python and PsychoPy functions. Now I have to add eye tracking functions to interact with an EyeLink 1000+ eye tracker.
I need basic functions like calibration and recording eye movements during the trial.
There is so much material about all of these packages and platforms that I can't figure out which one would be better, more practical, and easier to implement, because I am not a good Python programmer.
Also, if someone could show me an example script with eye tracking implemented, I would really appreciate it!
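For reference, this is the sort of minimal skeleton I am imagining, pieced together from the iohub Common Eye Tracker Interface docs. It is untested, and the config keys and settings below are my guess at the minimal EyeLink entry, so they would need adjusting for a real EyeLink 1000+ setup:

```python
# Rough, untested sketch of calibration + recording with PsychoPy's iohub.
# The config dict is a guess at a minimal EyeLink entry; real settings
# (sample rate, eye to track, etc.) depend on the actual setup.
from psychopy import visual, core
from psychopy.iohub import launchHubServer

win = visual.Window(fullscr=True, units='pix')

iohub_config = {
    'eyetracker.hw.sr_research.eyelink.EyeTracker': {'name': 'tracker'}
}
io = launchHubServer(**iohub_config)
tracker = io.devices.tracker

tracker.runSetupProcedure()        # launches the EyeLink calibration screens
tracker.setRecordingState(True)    # start recording before the trial

# ... run one Posner-style trial here, drawing stimuli and flipping win ...
core.wait(2.0)                     # placeholder for the actual trial code

tracker.setRecordingState(False)   # stop recording after the trial
io.quit()
win.close()
core.quit()
```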
I tried to run the demo that you suggested. It seems to run OK, but only directly from PsychoPy.
I could not get the code to run from the command prompt, no matter where the file was (in the Python folder or the PsychoPy folder).
Hm, maybe the dependencies aren't set up right? Are you using the same Python that PsychoPy uses (i.e., what's the result of typing which python, or where python on Windows)?
It’s hard to tell without some sort of error message so if you get something please post it.
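One quick way to check from inside Python itself (just a sanity check, nothing PsychoPy-specific):

```python
# Run this from the same command prompt you are using for the experiment:
# it prints which interpreter is executing and whether psychopy is importable.
import sys
print(sys.executable)

try:
    import psychopy
    print("psychopy", psychopy.__version__, "at", psychopy.__file__)
except ImportError:
    print("psychopy is not installed for this interpreter")
```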
It seems to be running normally now. I have no idea what the problem was before…
Just another question… do you know if it's possible to control the eye tracker from another PC?
With the code, I can only start the calibration, for example, from the computer that has the EyeLink software installed and is connected via Ethernet to the PC used to run the code.
I'm not sure, unfortunately. I have always used the same computer for stimulus presentation and eye tracking, so I don't know if you can remotely control an eye tracker via Ethernet. Maybe someone else here does?
What kind of control are you looking for? If you're trying to send event markers back to the eye tracker, I haven't figured out how to get that to work on an EyeLink, and I ultimately just did the whole thing in PsychoPy (because with iohub you can record the output from the eye tracker in your own custom data structures).
Yes, I was hoping to send messages back to the eye tracker so the EDF file could have markers for stimulus presentation and the target response.
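If I end up talking to the tracker directly with pylink instead, my (untested) understanding from SR Research's example scripts is that markers would go into the EDF roughly like this; the IP address, file names, and message strings here are just placeholders:

```python
# Untested sketch based on SR Research's pylink examples: open an EDF on the
# host PC, record, and drop text markers into the file with sendMessage().
import pylink

el = pylink.EyeLink("100.1.1.1")       # default host PC address; adjust if needed
el.openDataFile("posner.edf")
el.startRecording(1, 1, 1, 1)          # record samples + events to file and link

el.sendMessage("TRIAL_START 1")        # e.g. at trial onset
# ... present cue and target ...
el.sendMessage("TARGET_ONSET")
# ... collect response ...
el.sendMessage("RESPONSE left")

el.stopRecording()
el.closeDataFile()
el.receiveDataFile("posner.edf", "posner_local.edf")  # copy the EDF to this PC
el.close()
```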
But perhaps your way could work too. The most important information that I need from the tracker is whether the person shifts their eyes during the trial.
So, what kind of data are you recording? Eye movements and event times?
iohub comes with tracker.getLastGazePosition(), which just gets the most recent gaze position from the eye tracker as a simple [x, y] list. You can't get velocity directly, as far as I know, but you could definitely compare the gaze position at time t1 with the previous position at time t0 and work it out that way. I think that will give you what you need.
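Roughly, the comparison I mean would look something like this (a sketch only; tracker and win are assumed to already exist, n_frames is however long your trial lasts, and the 50-pixel threshold is just a placeholder to tune for your setup):

```python
# Per-frame check for a gaze shift: compare the latest gaze position with the
# one from the previous frame and flag a shift if the jump is large enough.
shift_threshold = 50      # pixels; placeholder value
prev_pos = None

for frame in range(n_frames):
    gpos = tracker.getLastGazePosition()
    # a valid position comes back as something list/tuple-like; anything else
    # (e.g. during a blink) is skipped
    if isinstance(gpos, (tuple, list)):
        if prev_pos is not None:
            dx = gpos[0] - prev_pos[0]
            dy = gpos[1] - prev_pos[1]
            if (dx ** 2 + dy ** 2) ** 0.5 > shift_threshold:
                print("gaze shift on frame", frame)
        prev_pos = gpos
    win.flip()
```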
My program just calls getLastGazePosition on every win.flip() and records the gaze position in a straightforward list along with some data about the current trial and a timestamp, and I write that list to a CSV at the end of the experiment. In theory, if you could work out the timing (I haven't, because 60 Hz is sufficient for my study), you could get much higher-resolution data than that.
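In case it helps, the logging idea reduces to something like this (again a sketch, not my actual code; trial_id, n_frames, tracker, and win are assumed to be defined elsewhere, and the CSV file name is arbitrary):

```python
# Grab the newest gaze sample on each flip, keep it in a plain list together
# with trial info and the flip timestamp, then dump everything to CSV.
import csv

gaze_log = []

for frame in range(n_frames):
    flip_time = win.flip()                  # flip() returns the flip timestamp
    gpos = tracker.getLastGazePosition()
    if isinstance(gpos, (tuple, list)):
        gaze_log.append([trial_id, frame, flip_time, gpos[0], gpos[1]])
    else:
        gaze_log.append([trial_id, frame, flip_time, None, None])

# at the very end of the experiment
with open("gaze_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["trial", "frame", "flip_time", "gaze_x", "gaze_y"])
    writer.writerows(gaze_log)
```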