Running a YouTube experiment with PsychoPy and a Tobii TX300 tracker using Tobii SDK 3.1.0

Hello everyone, I am new to the eye tracking field. I know it may be a silly question, but I need your help :slight_smile:
Does anybody know how to run a YouTube experiment with a combination of Tobii SDK 3.1.0 and PsychoPy?

Note: I am using tobiicontroller.py (written by Hiroyuki Sogo, modified by Horea Christian) and PsychoPy.

I’m not sure I understand what you’d like to do. Would you like people to watch YouTube videos while doing eye tracking? Or just any videos?

PsychoPy has a class for videos:
http://www.psychopy.org/api/visual/moviestim.html#psychopy.visual.MovieStim
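Something like this could work for a local video file (just a minimal sketch; the window size and the file name 'my_video.mp4' are placeholders):

```python
# Minimal sketch: play a local video with MovieStim.
# 'my_video.mp4' is a placeholder file name.
from psychopy import visual, core
from psychopy.constants import FINISHED

win = visual.Window(size=(1280, 720), units='pix')
mov = visual.MovieStim(win, 'my_video.mp4', size=(1280, 720), loop=False)

mov.play()
while mov.status != FINISHED:
    mov.draw()   # draw the current frame
    win.flip()   # push it to the screen

win.close()
core.quit()
```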

Thank you, Jan, for replying. Yes, only YouTube videos. My requirement is that I have to split each video frame into regions and then track which region the observer is looking at. I hope that makes my requirement clear. Thank you.

One thing I would worry about: I would not want to stream the videos from YouTube during the experiment, since network streams are highly variable (and you’ll probably get a ton of ads too!).

Unless you have a good reason not to, I would download the videos first. Check out youtube-dl.
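If you’d rather script the download than use the command line, youtube-dl also has a Python API. A minimal sketch (the URL is just a placeholder, and you’d need `pip install youtube-dl` first):

```python
# Download a video ahead of the experiment with youtube-dl's Python API.
import youtube_dl

options = {
    'outtmpl': '%(title)s.%(ext)s',  # name the file after the video title
    'format': 'mp4',                 # request an mp4 so PsychoPy's MovieStim can read it
}
with youtube_dl.YoutubeDL(options) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=EXAMPLE_ID'])  # placeholder URL
```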

Hello Riggs! Actually, my requirement doesn’t involve watching a video on YouTube. Here is how it goes:
The observer opens the YouTube homepage, which shows several videos grouped by category (trending, recommended, etc.). We need to track the observer’s viewing pattern: which video (frame) they are looking at, and which video they move to next.
To do that, I need to divide the YouTube homepage into several regions (frames) based on the number of videos, then track the observer’s eye movements and check in which region their fixations fall.
I am able to split the regions, but I am not able to run the experiment with the tracker: when I run tobiicontroller.py it goes straight to the dots demo. I need to replace the dots demo with the YouTube homepage experiment.
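To make the region part concrete, this is roughly what I mean by checking which region a gaze point falls into (just a rough sketch; the screen size, grid layout, and example gaze point are placeholders, not my actual code):

```python
# Map a gaze sample (in screen pixels) to a grid region (AOI) of the homepage.
SCREEN_W, SCREEN_H = 1920, 1080   # TX300 screen resolution (placeholder)
N_COLS, N_ROWS = 4, 3             # however many video thumbnails fit on the page

def region_of(x, y):
    """Return the (row, col) of the region containing pixel (x, y), or None."""
    if not (0 <= x < SCREEN_W and 0 <= y < SCREEN_H):
        return None  # gaze sample was off-screen or invalid
    col = int(x // (SCREEN_W / N_COLS))
    row = int(y // (SCREEN_H / N_ROWS))
    return row, col

# Example: a fixation at pixel (1500, 200) lands in the top-right part of the grid
print(region_of(1500, 200))  # -> (0, 3)
```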

If you want to retain good synchronisation between your eye tracking data and what is displayed on the screen, and you are dead set on using tobiicontroller.py, one option could be to rebuild the YouTube page you want people to look at in PsychoPy, using ImageStims and TextStims from the psychopy.visual module:
http://www.psychopy.org/api/visual.html
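Something along these lines, for example (a rough sketch only; the thumbnail files, positions, and sizes are placeholders, not the real YouTube layout):

```python
# Mock up a static "homepage" out of ImageStims (thumbnails) and TextStims (titles).
from psychopy import visual, event

win = visual.Window(size=(1920, 1080), units='pix', color='white', fullscr=True)

positions = [(-600, 300), (0, 300), (600, 300),
             (-600, -150), (0, -150), (600, -150)]
stims = []
for i, pos in enumerate(positions):
    # 'thumb0.png', 'thumb1.png', ... are placeholder screenshots of the thumbnails
    stims.append(visual.ImageStim(win, image='thumb%d.png' % i,
                                  pos=pos, size=(480, 270)))
    stims.append(visual.TextStim(win, text='Video title %d' % i, color='black',
                                 pos=(pos[0], pos[1] - 170), height=24))

# Draw the mock page once and leave it on screen while the tracker records
for stim in stims:
    stim.draw()
win.flip()
event.waitKeys()  # wait for a key press before closing
win.close()
```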

However, I generally think your best bet is not to use PsychoPy. Tobii Studio is good for recording eye tracking data on top of websites.

Apologies if that’s not what you want - I know how annoying it is when people suggest different software when you plan on using a specific thing. But if you want people to interact with a website (not something PsychoPy can do) and record data with tobiicontroller.py (not something you need PsychoPy for), then maybe PsychoPy isn’t suited to your experiment.


Hey Jan, thank you for your suggestions. By the way, I should not use Tobii Studio; I have to use the Tobii SDK only to run the experiment, so I am looking for options!

I don’t know how to do this in PsychoPy, but here’s a different suggestion you can take a look at if you’d like.

It’s an app I wrote that can record your gaze from the Tobii TX300 and save your coordinates along with dwell time. You could open a YouTube video page and place it on top of the app’s window, and it should still work just fine. It won’t be as precisely temporally coupled with frame-by-frame presentation as Tobii Studio, but it’s a quick and useful solution nonetheless.

Here’s a tutorial video on YouTube that you can look at to get a preview.
