Eyetracking Tobii TX300

Hi @timon,

I think I misunderstood the root issue you were describing. By ‘gap in samples’ do you mean that the TRIAL_END message for trial T-1 is being sent 0.x seconds after the TRIAL_START message for trial T, so the samples between the two messages are not being exported from the hdf5 file? If that is the case, then you can just move where the TRIAL_END messages are written so that each one is sent right before the next TRIAL_START.

For the first trial, just send the TRIAL_START message in Begin Routine and do not send any message in End Routine.

For trials 2 to N, put both a TRIAL_END and TRIAL_START message in the trial’s Begin Routine, like:

ioServer.sendMessageEvent("TRIAL_END") # ending last trial
ioServer.sendMessageEvent("TRIAL_START") # starting next trial

and, for trials other than the last one, nothing in the End Routine tab.

For the last trial, also send a TRIAL_END message in the End Routine tab.
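Putting the three cases together, the ordering logic can be sketched as below. A small stand-in class for the ioHub connection is included only so the sketch runs without an eyetracker attached; in the real experiment, ioServer is the ioHub connection and the calls live in the Begin Routine / End Routine code tabs, not in a loop like this:

```python
class FakeIOServer:
    """Stand-in for the ioHub connection; just records the messages sent."""
    def __init__(self):
        self.messages = []

    def sendMessageEvent(self, text):
        # The real ioHub call timestamps the message and stores it in the hdf5 file.
        self.messages.append(text)


ioServer = FakeIOServer()  # in the experiment this is the real ioHub connection

n_trials = 3
for trial in range(1, n_trials + 1):
    # --- Begin Routine ---
    if trial > 1:
        ioServer.sendMessageEvent("TRIAL_END")   # close the previous trial
    ioServer.sendMessageEvent("TRIAL_START")     # open the current trial
    # ... the trial itself runs here ...
    # --- End Routine ---
    if trial == n_trials:
        ioServer.sendMessageEvent("TRIAL_END")   # close the final trial
```

This way every TRIAL_END is issued back-to-back with the next TRIAL_START, so no samples fall between the two messages.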

Attached is an untested version of the experiment with the message placement changes made to it.

Eyetracker_Psychopy_experiment_10-10-21_v3.psyexp (37.2 KB)

Please let us know how it goes. Thank you!

Hi @sol,

sorry for the late response, but I can only work with the eyetracker once a week, so that is when I can test new ideas. Sadly the new file didn't work for me. I guess you did understand my problem in the first place, but I will try to describe it again.
After I run the experiment and look at the final file with my eyetracking data, there is always a gap in time of about 0.5 seconds every time the trial index increases. For example, my trial 1 ends at 72.1760902996175 seconds and then my trial 2 begins at 72.63460599957034 seconds. Sadly those missing 0.5 seconds are very important for the experiment, so I can't just overlook them, and I don't know how to fix that. The last experiment you sent me (eyetracking.psyexp (20.7 KB)) didn't have this problem, but sadly it doesn't use a loop, and I don't understand why there is no time gap in your experiment while in my case there is: Eyetracker_Psychopy_experiment_10-10-21_v3.psyexp (41.3 KB)

Thank you for your help
Timon

Hi @timon,

In the last example I sent you (Eyetracker_Psychopy_experiment_10-10-21_v3.psyexp), the trial start message for trial T and the trial end message for trial T-1 are sent right after one another in the experiment script, so I do not see how there could be gaps greater than a msec or two in the trial start / stop times.

Do you mind sending me a copy of your most recent experiment project (with any resources needed to run it) and an hdf5 file saved from running it, so I can see what is going on? You can private message me the link to the experiment if the resources are private / should not be shared.

Thanks again

Hi @timon ,

Thank you for sending the files. The issue is that you are still using the ET Record component in each of your routines. If you look at the experiment I sent, it uses custom code to start recording after calibration and then stop recording at the end of the experiment; it does not use the ET Record components.

(screenshot: the ET Record components being removed from each routine)

If you remove the ET Record components from each routine, the Tobii will not stop and restart eye tracking between routines, so you will not get a gap in the sample stream.
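The custom-code approach can be sketched as below. setRecordingState is the real ioHub eyetracker call, but a stand-in tracker class is used here so the flow can be run without a Tobii attached; in the experiment, the device comes from the ioHub connection (e.g. ioServer.getDevice('tracker')):

```python
class FakeTracker:
    """Stand-in for the ioHub eyetracker device; logs recording state changes."""
    def __init__(self):
        self.calls = []

    def setRecordingState(self, state):
        # The real ioHub call starts/stops the Tobii sample stream.
        self.calls.append(state)
        return state


tracker = FakeTracker()  # in the experiment: the ioHub eyetracker device

# In a code component, right after calibration and before the trial loop:
tracker.setRecordingState(True)    # start one continuous recording

# ... the entire trial loop runs here; no ET Record components needed ...

# In a code component at the very end of the experiment:
tracker.setRecordingState(False)   # stop the single recording
```

Because recording is started once and stopped once, the sample stream is continuous across all trials, and the TRIAL_START / TRIAL_END messages are the only per-trial bookkeeping needed.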

Modified project attached:

Eyetracker_Psychopy_experiment_10-10-21_v4.psyexp (37.2 KB)

Thank you


Hi @sol ,

thank you very much for all of your help, now it runs perfectly. I appreciate that a lot.

Timon

Yahoo, glad to help. Thanks for sticking with it, even with the bumps in the road.

Hi @sol ,

it is me again :smiley: , I have a question about live eyetracking. We want to see where the pupil is moving during our eyetracking experiment, but the participant sitting in front of the eyetracker should not see their own eye movements; only the person behind the PC where the PsychoPy experiment was started should. Is this possible with PsychoPy? I haven't found anything about it. And do we maybe have to connect the PC and the eyetracker in a different way? Right now we use an HDMI cable to connect the two devices.

have a great week,
Timon

Hi sol,

I had the same problem but solved it after reading your explanation. Thanks a lot for the uploaded file, and especially for showing how to read the eye movement data! Really appreciate that.

Hi @sol, thanks so much for all this documentation, it worked perfectly for me. However, we are hoping to include other event types in the .txt file in addition to binocular data, including FixationStartEvent, FixationEndEvent, SaccadeStartEvent, and SaccadeEndEvent. It was easy enough to replace the SAVE_EVENT_TYPE and SAVE_EVENT_FIELDS with a single one of these event types and get that data into a different .txt file, but ideally we would have one .txt file that stored all the data together.
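One way to get everything into a single file is to merge the separate event streams on their shared timestamp before writing them out. A minimal sketch of that merge step, using hypothetical hand-made event tuples (in practice each list would come from reading the corresponding event table out of the ioHub hdf5 file, e.g. with h5py or pandas; the field layouts here are assumptions, not the real ioHub schemas):

```python
# Hypothetical events as (time, event_type) tuples; real ioHub events
# carry many more fields (gaze position, pupil size, etc.).
samples   = [(0.01, "BinocularEyeSample"), (0.03, "BinocularEyeSample")]
fix_start = [(0.02, "FixationStart")]
fix_end   = [(0.05, "FixationEnd")]

# Merge every stream and sort on the shared timestamp, so one export
# holds all event types in chronological order.
merged = sorted(samples + fix_start + fix_end, key=lambda e: e[0])

# Tab-separated lines ready to be written to a single .txt file.
lines = ["\t".join(str(v) for v in event) for event in merged]
```

Since the different event types have different columns, one practical design is to write a shared prefix (time, event type) followed by the type-specific fields, so downstream code can dispatch on the event-type column.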

Thanks so much!

@jgeller112, it would be great if you could send me that template experiment!