Hi,
The improvements to eyetracking support in Builder look amazing. Unfortunately, I'm having trouble understanding how iohub with eyetracking is supposed to work now. I have an experiment that ran fine in 2021.1.x, but after upgrading to 2021.2.2 it doesn't work at all. This is the error I get now:
Traceback (most recent call last):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 667, in createNewMonitoredDevice
    dev_data = self.addDeviceToMonitor(dev_cls_name, dev_conf)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 818, in addDeviceToMonitor
    dev_cls_name)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\__init__.py", line 935, in import_device
    module = __import__(module_path, fromlist=["{}".format(device_class_name)])
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\__init__.py", line 8, in <module>
    from .eyetracker import *
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\eyetracker.py", line 10, in <module>
    from psychopy.iohub.devices.eyetracker.hw.tobii.tobiiCalibrationGraphics import TobiiPsychopyCalibrationGraphics
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 19, in <module>
    class TobiiPsychopyCalibrationGraphics(object):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 27, in TobiiPsychopyCalibrationGraphics
    EventConstants.KEYBOARD_RELEASE).CLASS_ATTRIBUTE_NAMES.index('key')
AttributeError: 'NoneType' object has no attribute 'CLASS_ATTRIBUTE_NAMES'

Error during device creation ....

Traceback (most recent call last):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 667, in createNewMonitoredDevice
    dev_data = self.addDeviceToMonitor(dev_cls_name, dev_conf)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 818, in addDeviceToMonitor
    dev_cls_name)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\__init__.py", line 935, in import_device
    module = __import__(module_path, fromlist=["{}".format(device_class_name)])
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\__init__.py", line 8, in <module>
    from .eyetracker import *
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\eyetracker.py", line 10, in <module>
    from psychopy.iohub.devices.eyetracker.hw.tobii.tobiiCalibrationGraphics import TobiiPsychopyCalibrationGraphics
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 19, in <module>
    class TobiiPsychopyCalibrationGraphics(object):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 27, in TobiiPsychopyCalibrationGraphics
    EventConstants.KEYBOARD_RELEASE).CLASS_ATTRIBUTE_NAMES.index('key')
AttributeError: 'NoneType' object has no attribute 'CLASS_ATTRIBUTE_NAMES'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 601, in _addDevices
    self.createNewMonitoredDevice(dev_cls_name, dev_conf)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 680, in createNewMonitoredDevice
    raise ioHubError('Error during device creation ....')
psychopy.iohub.errors.ioHubError: ioHubError:
Args: ('Error during device creation ....',)
This is what I've been using to load the YAML configuration and get ioHub going:
from psychopy import core
from psychopy.iohub import util, client

config_file = 'tobii_config.yaml'

# name of the eye tracker device (i.e. the 'name' specified in the YAML file)
eye_tracker_name = 'tracker'

# read the YAML config file into a dict
io_config = util.readConfig(config_file)

# attempt to connect to the devices specified in the YAML config file
io_connection = client.ioHubConnection(io_config)

# check that the eye tracker device is available, and otherwise quit;
# keep it in a variable to make it easier to reference later
eye_tracker = io_connection.getDevice(eye_tracker_name)
if eye_tracker is None:
    print(
        f"Could not connect to eye tracker '{eye_tracker_name}', is it on?"
        " Quitting..."
    )
    core.quit()
I cobbled together the above script as best I could, basing quite a lot of it on what's described in the official PsychoPy development book. Looking at the eyetracking Builder demo in the most recent PsychoPy version, it seems that the more standard way of doing things now is to use io.launchHubServer and its associated methods, but switching to that didn't get me any further: the experiment crashed again. That time I think I got a message about PsychoPy, for some reason, trying to find a folder named 'monitor_devices'. I assume this means that ioHub now parses the YAML file/config dictionary contents differently.
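To show the kind of call I mean, here is a rough sketch of that approach. The iohub_config_name and window arguments are things I picked up from the docs and demos and may well be using wrongly, and 'tracker' is simply the device name from my YAML file:

from psychopy import core, visual
from psychopy.iohub import launchHubServer

# create the PsychoPy window first, since (as I understand it) 2021.2 can
# reuse it for the calibration graphics
win = visual.Window(fullscr=True, units='pix')

# my assumption: point launchHubServer at the existing YAML file via
# iohub_config_name instead of reading the YAML myself
io = launchHubServer(window=win, iohub_config_name='tobii_config.yaml')

# 'tracker' is the device name set in my YAML file
eye_tracker = io.getDevice('tracker')
if eye_tracker is None:
    print("Could not get the eye tracker device, quitting...")
    core.quit()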
This is the YAML file I have been using:
# configurations for iohub (ie eyetracker-related configurations, in this
# case) are based on the template 'Default Device Settings' @
# https://www.psychopy.org/api/iohub/device/eyetracker_interface/Tobii_Implementation_Notes.html
monitor_devices:
    - Display:
        name: EIZOFlexScanEV2451
        reporting_unit_type: pix
        device_number: 0
        physical_dimensions:
            width: 527
            height: 296
            unit_type: mm
        default_eye_distance:
            surface_center: 1000
            unit_type: mm
        psychopy_monitor_name: spectrum_monitor
    - Experiment:
        name: ease_infant_audiovisual
    # tobii eye tracker configuration
    - eyetracker.hw.tobii.EyeTracker:
        # Indicates if the device should actually be loaded at experiment runtime.
        enable: True
        # The variable name of the device that will be used to access the ioHub Device class
        # during experiment run-time, via the devices.[name] attribute of the ioHub
        # connection or experiment runtime class.
        name: tracker
        # Should eye tracker events be saved to the ioHub DataStore file when the device
        # is recording data?
        save_events: True
        # Should eye tracker events be sent to the Experiment process when the device
        # is recording data?
        stream_events: True
        # How many eye events (including samples) should be saved in the ioHub event buffer before
        # old eye events start being replaced by new events. When the event buffer reaches
        # the maximum event length defined here, older events will start to be dropped.
        event_buffer_length: 1024
        # The Tobii implementation of the common eye tracker interface supports the
        # BinocularEyeSampleEvent event type.
        monitor_event_types: [ BinocularEyeSampleEvent,]
        # The model name of the Tobii device that you wish to connect to can be specified here,
        # and only Tobii systems matching that model name will be considered as possible candidates for connection.
        # If you only have one Tobii system connected to the computer, this field can just be left empty.
        model_name:
        # The serial number of the Tobii device that you wish to connect to can be specified here,
        # and only the Tobii system matching that serial number will be connected to, if found.
        # If you only have one Tobii system connected to the computer, this field can just be left empty,
        # in which case the first Tobii device found will be connected to.
        serial_number:
        calibration:
            # Should the PsychoPy Window created by the PsychoPy Process be minimized
            # before displaying the Calibration Window created by the ioHub Process.
            minimize_psychopy_win: False
            # The Tobii ioHub Common Eye Tracker Interface currently supports
            # 3, 5 and 9 point calibration modes:
            # THREE_POINTS, FIVE_POINTS, NINE_POINTS
            type: NINE_POINTS
            # Should the target positions be randomized?
            randomize: True
            # auto_pace can be True or False. If True, the eye tracker will
            # automatically progress from one calibration point to the next.
            # If False, a manual key or button press is needed to progress to
            # the next point.
            auto_pace: True
            # pacing_speed is the number of sec.msec that a calibration point should
            # be displayed before moving onto the next point when auto_pace is set to True.
            # If auto_pace is False, pacing_speed is ignored.
            pacing_speed: 1.5
            # screen_background_color specifies the r,g,b background color to
            # set the calibration, validation, etc., screens to. Each element of the color
            # should be a value between 0 and 255. 0 == black, 255 == white.
            screen_background_color: [128, 128, 128]
            # target_type defines what form of calibration graphic should be used
            # during calibration, validation, etc. modes.
            # Currently the Tobii implementation supports the following
            # target type: CIRCLE_TARGET.
            # To do: Add support for other types.
            target_type: CIRCLE_TARGET
        runtime_settings:
            # The supported sampling rates for Tobii are model dependent.
            sampling_rate: 1200
            # The Tobii implementation supports BINOCULAR tracking mode only.
            track_eyes: BINOCULAR
        # manufacturer_name is used to store the name of the maker of the eye tracking
        # device. This is for informational purposes only.
        manufacturer_name: Tobii Technology

# specify ioHub data storage options
data_store:
    enable: True
    experiment_info:
        title: Visual search with eye tracking
    session_info:
        code: NEW_SUBJECT
For now I'm downgrading to 2021.1.4, but if possible I'd like to update the experiment so that it's easier for colleagues to build on in the future. Are there any materials to help with making the jump from 2021.1.x to 2021.2.0 and beyond? For example, a collection of example YAML files showing how a standard configuration for an eyetracking experiment should look now. By that I mean files that include not just the eyetracker but the monitor and keyboard devices as well, since I've been struggling to find out what format is expected for the full file.
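To make the question concrete, below is my current best guess at how a 'full' set of devices might instead be passed directly to launchHubServer as dicts. The Display and Keyboard entries and their fields are assumptions on my part, copied from the old YAML template, so they may well not be what 2021.2 expects:

from psychopy import visual
from psychopy.iohub import launchHubServer

win = visual.Window(fullscr=True, units='pix')

# device configuration passed as a dict and unpacked, since the Tobii key
# contains dots; all key names below are guesses based on the YAML template
devices_config = {
    'eyetracker.hw.tobii.EyeTracker': {
        'name': 'tracker',
        'runtime_settings': {'sampling_rate': 1200, 'track_eyes': 'BINOCULAR'},
    },
    # I am assuming the monitor and keyboard can be listed alongside the
    # eye tracker like this, but this is exactly the part I am unsure about
    'Display': {'reporting_unit_type': 'pix', 'device_number': 0},
    'Keyboard': {'name': 'keyboard'},
}

io = launchHubServer(window=win, experiment_code='ease_infant_audiovisual',
                     **devices_config)
eye_tracker = io.getDevice('tracker')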
In conjunction with this, is there a sample experiment or script that shows in code how one should now import iohub specifications from a YAML file? The only example I could find in the latest version of PsychoPy is the 'stroop keyboard' Builder demo. That one doesn't involve an eyetracker, and it seems to import the configuration in an older style: it doesn't use the aforementioned launchHubServer, and it reads the contents of the YAML file in a somewhat convoluted way.
Any tips or suggestions are highly appreciated. Even just a link to something like a GitHub repo for an experiment that uses eyetracking in PsychoPy 2021.2.x would help a lot.