Guidance on using eyetracking with ioHub and YAML files post 2021.2.0

Hi,

The improvements to eyetracking support in Builder seem amazing. Unfortunately for me, I’m having trouble understanding how iohub with eyetracking is supposed to work now. I have an experiment that was running fine in 2021.1.x, but after upgrading to 2021.2.2 it doesn’t work at all. I get this error now:

Traceback (most recent call last):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 667, in createNewMonitoredDevice
    dev_data = self.addDeviceToMonitor(dev_cls_name, dev_conf)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 818, in addDeviceToMonitor
    dev_cls_name)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\__init__.py", line 935, in import_device
    module = __import__(module_path, fromlist=["{}".format(device_class_name)])
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\__init__.py", line 8, in <module>
    from .eyetracker import *
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\eyetracker.py", line 10, in <module>
    from psychopy.iohub.devices.eyetracker.hw.tobii.tobiiCalibrationGraphics import TobiiPsychopyCalibrationGraphics
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 19, in <module>
    class TobiiPsychopyCalibrationGraphics(object):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 27, in TobiiPsychopyCalibrationGraphics
    EventConstants.KEYBOARD_RELEASE).CLASS_ATTRIBUTE_NAMES.index('key')
AttributeError: 'NoneType' object has no attribute 'CLASS_ATTRIBUTE_NAMES'
Error during device creation ....
Traceback (most recent call last):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 667, in createNewMonitoredDevice
    dev_data = self.addDeviceToMonitor(dev_cls_name, dev_conf)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 818, in addDeviceToMonitor
    dev_cls_name)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\__init__.py", line 935, in import_device
    module = __import__(module_path, fromlist=["{}".format(device_class_name)])
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\__init__.py", line 8, in <module>
    from .eyetracker import *
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\eyetracker.py", line 10, in <module>
    from psychopy.iohub.devices.eyetracker.hw.tobii.tobiiCalibrationGraphics import TobiiPsychopyCalibrationGraphics
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 19, in <module>
    class TobiiPsychopyCalibrationGraphics(object):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\devices\eyetracker\hw\tobii\tobiiCalibrationGraphics.py", line 27, in TobiiPsychopyCalibrationGraphics
    EventConstants.KEYBOARD_RELEASE).CLASS_ATTRIBUTE_NAMES.index('key')
AttributeError: 'NoneType' object has no attribute 'CLASS_ATTRIBUTE_NAMES'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 601, in _addDevices
    self.createNewMonitoredDevice(dev_cls_name, dev_conf)
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\server.py", line 680, in createNewMonitoredDevice
    raise ioHubError('Error during device creation ....')
psychopy.iohub.errors.ioHubError: ioHubError:
Args: ('Error during device creation ....',)

This is what I’ve been using to load in the YAML configurations and get ioHub going:

from psychopy import core  # needed for core.quit() below
from psychopy.iohub import util, client

config_file = 'tobii_config.yaml'

# specify name of eye tracker (ie the name
# specified in the YAML file)
eye_tracker_name = 'tracker'

# import config file
io_config = util.readConfig(config_file)

# attempt to connect to device specified in
# YAML config file
io_connection = client.ioHubConnection(io_config)

# check if can get details for eye tracker device,
# and otherwise quit
if io_connection.getDevice(eye_tracker_name):
    # assign the tracker to a variable to make
    # it easier to reference
    eye_tracker = io_connection.getDevice(eye_tracker_name)
else:
    print(
        f"Could not connect to eye tracker '{eye_tracker_name}', is it on?"
        " Quitting..."
    )
    core.quit()

I cobbled together the above script as best I could, basing quite a lot of it on what’s described in the official PsychoPy development book. Looking at the eyetracking ‘builder demo’ from the most recent PsychoPy version, it seems that the more standard way to do things now is to use io.launchHubServer and its associated methods, but doing so didn’t get me any further, and the experiment crashed again. That time I think I got a message about PsychoPy, for some reason, trying to find a folder named ‘monitor_devices’. I assume this means that ioHub now parses YAML file/config dictionary contents differently.

This is the YAML file I have been using

# configurations for iohub (ie eyetracker-related configurations, in this
# case) are based on the template 'Default Device Settings' @
# https://www.psychopy.org/api/iohub/device/eyetracker_interface/Tobii_Implementation_Notes.html

monitor_devices:
    - Display:
        name: EIZOFlexScanEV2451
        reporting_unit_type: pix
        device_number: 0
        physical_dimensions:
            width: 527
            height: 296
            unit_type: mm
        default_eye_distance:
            surface_center: 1000
            unit_type: mm
        psychopy_monitor_name: spectrum_monitor

    - Experiment:
        name: ease_infant_audiovisual

    # tobii eye tracker configuration
    - eyetracker.hw.tobii.EyeTracker:
        # Indicates if the device should actually be loaded at experiment runtime.
        enable: True

        # The variable name of the device that will be used to access the ioHub Device class
        # during experiment run-time, via the devices.[name] attribute of the ioHub
        # connection or experiment runtime class.
        name: tracker

        # Should eye tracker events be saved to the ioHub DataStore file when the device
        # is recording data ?
        save_events: True

        # Should eye tracker events be sent to the Experiment process when the device
        # is recording data ?
        stream_events: True

        # How many eye events (including samples) should be saved in the ioHub event buffer before
        # old eye events start being replaced by new events. When the event buffer reaches
        # the maximum event length of the buffer defined here, older events will start to be dropped.
        event_buffer_length: 1024

        # The Tobii implementation of the common eye tracker interface supports the
        # BinocularEyeSampleEvent event type.
        monitor_event_types: [ BinocularEyeSampleEvent,]

        # The model name of the Tobii device that you wish to connect to can be specified here,
        # and only Tobii systems matching that model name will be considered as possible candidates for connection.
        # If you only have one Tobii system connected to the computer, this field can just be left empty.
        model_name:

        # The serial number of the Tobii device that you wish to connect to can be specified here,
        # and only the Tobii system matching that serial number will be connected to, if found.
        # If you only have one Tobii system connected to the computer, this field can just be left empty,
        # in which case the first Tobii device found will be connected to.
        serial_number:

        calibration:
            # Should the PsychoPy Window created by the PsychoPy Process be minimized
            # before displaying the Calibration Window created by the ioHub Process.
            #
            minimize_psychopy_win: False

            # The Tobii ioHub Common Eye Tracker Interface currently supports
            # 3, 5 and 9 point calibration modes.
            # THREE_POINTS,FIVE_POINTS,NINE_POINTS
            #
            type: NINE_POINTS

            # Should the target positions be randomized?
            #
            randomize: True

            # auto_pace can be True or False. If True, the eye tracker will 
            # automatically progress from one calibration point to the next.
            # If False, a manual key or button press is needed to progress to
            # the next point.
            #
            auto_pace: True
        
            # pacing_speed is the number of sec.msec that a calibration point should
            # be displayed before moving onto the next point when auto_pace is set to true.
            # If auto_pace is False, pacing_speed is ignored.
            #
            pacing_speed: 1.5
        
            # screen_background_color specifies the r,g,b background color to 
            # set the calibration, validation, etc, screens to. Each element of the color
            # should be a value between 0 and 255. 0 == black, 255 == white.
            #
            screen_background_color: [128,128,128]
        
            # Target type defines what form of calibration graphic should be used
            # during calibration, validation, etc. modes.
            # Currently the Tobii implementation supports the following
            # target type: CIRCLE_TARGET. 
            # To do: Add support for other types, etc.
            #
            target_type: CIRCLE_TARGET
    
        runtime_settings:
            # The supported sampling rates for Tobii are model dependent.
            sampling_rate: 1200

            # Tobii implementation supports BINOCULAR tracking mode only.
            track_eyes: BINOCULAR
            
        # manufacturer_name is used to store the name of the maker of the eye tracking
        # device. This is for informational purposes only.
        manufacturer_name: Tobii Technology

# specify ioHub data storage options
data_store:
    enable: True
    experiment_info:
        title: Visual search with eye tracking
    session_info:
        code: NEW_SUBJECT

For now I’m downgrading to 2021.1.4, but if possible I’d like to update the experiment so that it’s easier for colleagues to build on in the future. Are there any materials to help with making the jump from 2021.1.x to 2021.2.0 and beyond? E.g. a collection of example YAML files showing what standard eyetracking configurations should look like. I mean files that include not just the eyetracker but also the monitor and keyboard, as I’ve been struggling to find the expected format for a full file.

In conjunction with this, is there a sample experiment/file that shows in code how one should now import iohub specifications from a YAML file? The only example I could find in the latest version of PsychoPy is the ‘stroop keyboard’ Builder demo. That one doesn’t involve an eyetracker, and it seems to import the configurations in an older style: it doesn’t use the aforementioned ‘launchHubServer’ and uses a somewhat convoluted method for reading the contents of the YAML file.

Any tips or suggestions are highly appreciated :slight_smile: Even just a link to something like a github repo for an experiment that uses eyetracking in PsychoPy 2021.2.x would help a lot.

I think we’ll need @sol to help on this one

When you are using Builder you do not need to worry about creating an iohub configuration file / dict; Builder handles this for you. Use the Eye Tracker tab in Builder Experiment Settings to set which eye tracker you want to use and then configure it.

When writing your experiment in Python / using Coder, using launchHubServer has been the suggested way of starting iohub for several years now. Which demo / example uses the client.ioHubConnection directly? I should update that if possible.

I think the exception you are getting is related to using client.ioHubConnection directly. You can change your example to use launchHubServer as follows, which seems to fix the error for me:

from psychopy.iohub import launchHubServer

config_file = 'tobii_config.yaml'

# ...

# Start iohub using 'tobii_config.yaml'
io_connection = launchHubServer(iohub_config_name=config_file)

# ....

The latest iohub Coder demos for eye tracking can be found in the psychopy\psychopy\demos\coder\iohub\eyetracking folder. Starting in 2021.2.x the demos create the eye tracker device configuration as a python dict instead of in a yaml file. The format is the same, just skipping the yaml → dict step that was being done by util.readConfig(config_file) in your example. You can do either, really. The psychopy\psychopy\demos\coder\iohub\eyetracking\validation.py demo gives an example with many of the calibration settings for each eye tracker, including some additions made in 2021.2.x.
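To illustrate the dict-based approach, here is a sketch of what the YAML file posted above might look like when expressed directly as a Python dict (the key names mirror that YAML; treat the exact values as illustrative rather than a verified Tobii config, and note the launchHubServer call is commented out so the snippet stands on its own):

```python
# Sketch: the eyetracker section of the YAML above, as a Python dict.
# Keys mirror the 'eyetracker.hw.tobii.EyeTracker' YAML section.
eyetracker_config = {
    'name': 'tracker',
    'save_events': True,
    'stream_events': True,
    'runtime_settings': {
        'sampling_rate': 1200,
        'track_eyes': 'BINOCULAR',
    },
    'calibration': {
        'type': 'NINE_POINTS',
        'auto_pace': True,
        'pacing_speed': 1.5,
        'target_type': 'CIRCLE_TARGET',
    },
}
devices_config = {'eyetracker.hw.tobii.EyeTracker': eyetracker_config}

# With PsychoPy available, this dict can be passed straight to iohub,
# skipping the yaml -> dict step entirely:
# from psychopy.iohub import launchHubServer
# io = launchHubServer(window=win, **devices_config)
```

Either route ends up with the same configuration structure; the dict form just removes the file-reading step.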

Unfortunately I just noticed that the iohub doc pages seem to be somewhat broken right now (missing tables, …) so they will be of even less use than normal right now. :wink: I’ll work on fixing that in September.

If you look at the latest iohub Coder demos, you can see that launchHubServer now accepts the psychopy experiment window being used. When this is used, and it has a valid Monitor Center config file, iohub Display settings are updated to match those of the psychopy experiment window, so you do not need to create a separate iohub Display device config anymore.

Thank you.

1 Like

Thanks a lot for the reply,

I’m in the process of trying to make the experiment work with the latest version of PsychoPy now and will update this reply as I go.

When writing your experiment in Python / using Coder, using launchHubServer has been the suggested way of starting iohub for several years now. Which demo / example uses the client.ioHubConnection directly? I should update that if possible.

It doesn’t include eyetracking, but this .py file and its corresponding .psyexp file use the ioHubConnection method. This seems to be the only demo that does this. (BTW, GitHub’s ‘within-repo search’ functionality is great for finding things like this, if you’re not already using it.) Also, the examples in the official PsychoPy book by Jon and Michael do the same thing, IIRC. (I don’t think the book has been updated in a few years, though.)

Updated experiment
I tried changing the code as you suggested, and using the new ‘mouse/mock eyetracker’ (since I’m not at a computer with an eyetracker now). I wanted to first try changing just the code, since I think switching over to the Builder components for eyetracking would lead to chain effects where more has to be restructured.

from psychopy import core, visual
from psychopy.iohub import launchHubServer

# (this window setup is automatically generated by Builder; there is a valid
# 'spectrum_monitor' specification file on the computer I'm using)
win = visual.Window(
    size=[1920, 1080], fullscr=True, screen=0, 
    winType='pyglet', allowGUI=False, allowStencil=False,
    monitor='spectrum_monitor', color=[1,1,1], colorSpace='rgb',
    blendMode='avg', useFBO=True, 
    units='deg'
)

config_file = 'MOUSE_eyetracker.yaml'

eye_tracker_name = 'tracker'

io_connection = launchHubServer(
    iohub_config_name=config_file,
    window=win
)

if io_connection.getDevice(eye_tracker_name):
    # assign the tracker to a variable to make
    # it easier to reference
    eye_tracker = io_connection.getDevice(eye_tracker_name)
else:
    print(
        f"Could not connect to eye tracker '{eye_tracker_name}', is it on?"
        " Quitting..."
    )
    core.quit()

(Note that I’m passing in the window instance now, based on launchHubServer’s documentation and what you wrote)

Here’s the ‘MOUSE_eyetracker.yaml’ file, after being stripped of redundant sections, in case it helps others who might want to still use the separate YAML file approach.

# SIMULATES eyetracker by using the mouse
monitor_devices:
    - eyetracker.hw.mouse.EyeTracker:
        enable: True
        name: tracker
        controls:
            move: [LEFT_BUTTON,]
            blink: [LEFT_BUTTON, RIGHT_BUTTON]
            saccade_threshold: 0.5
        monitor_event_types: [ MonocularEyeSampleEvent, FixationStartEvent, FixationEndEvent, SaccadeStartEvent, SaccadeEndEvent, BlinkStartEvent, BlinkEndEvent]
data_store:
    enable: True
    experiment_info:
        title: Infant visual search with audio stimuli and eye tracking
    session_info:
        code: EXPERIMENT_RUN

This at first caused an error, but that turned out to be because I had forgotten to change the ‘display number’ to 1 in Builder (0 in the code above), since I’m now testing directly on my laptop’s screen (very similar to what’s discussed here), plus a separate issue with the PTB sound engine.

Everything now works as expected. Having the mouse/mock eyetracker built into PsychoPy/ioHub directly is brilliant and really helps for development.

Thanks very much for the info and update.

I think the stroop_keyboard demo can be removed completely from the demo set since it was written before PsychoPy Builder used the psychopy.hardware.keyboard.Keyboard class for keyboard events.

Since client.ioHubConnection was used in the book example, I should at least try to get that approach working again in 2021.2.x and update the docs to make it clearer that client.ioHubConnection is not intended for direct creation anymore and that launchHubServer should be used instead.

1 Like

Hi again,

I just noticed that no HDF5 files (“events.hdf5”) are created, at least when I run the experiment using the ‘mouse eyetracker’ (can’t test with a real one atm). Is this as intended for the ‘mouse’ eyetracker? Is there any way to force PsychoPy/ioHub to save the data anyway?

I tried updating the YAML file with any configurations I thought might be needed to tell ioHub to save the eyetracker data:

monitor_devices:
    - Experiment:
        name: ease_infant_audiovisual

    - eyetracker.hw.mouse.EyeTracker:
        enable: True

        name: tracker

        save_events: True

        stream_events: True

        event_buffer_length: 1024

        controls:
            move: [LEFT_BUTTON,]
            blink: [LEFT_BUTTON, RIGHT_BUTTON]
            saccade_threshold: 0.5
        monitor_event_types: [ MonocularEyeSampleEvent, FixationStartEvent, FixationEndEvent, SaccadeStartEvent, SaccadeEndEvent, BlinkStartEvent, BlinkEndEvent]

data_store:
    enable: True
    experiment_info:
        title: Infant visual search with audio stimuli and eye tracking
    session_info:
        code: EXPERIMENT_RUN

But this doesn’t seem to have any effect. I also looked at the experiment demos and the API documentation, but I didn’t find anything that made it clear how to get the data saved to an HDF5 file.

Here’s what I’m doing just before the experiment ends (ie core.quit())

# tell the device object to stop recording
eye_tracker.setRecordingState(False)
# tell the connection/object resulting from launchHubServer call
# to 'quit'
io_connection.quit() 

I’m guessing I’m missing something obvious here.

Could you send the full Builder project or python script you are using? If using launchHubServer, you need to include the experiment_code kwarg, like is done in the gcCursor Coder demo.

io_hub = launchHubServer(window=window, experiment_code='gc_cursor', **devices_config)

If you are using Builder and 2021.2.x, I would really suggest just starting to use the built-in eye tracking functionality in Builder, even if only to configure the eye tracker / start iohub, and then use custom code to access the eye tracker and write custom logic like you did before.

If using one of the included Builder demos that uses the new built in eye tracking functionality, you need to enable ‘Save hdf5 file’ in the experiment Properties → Data tab.

If wanting to test using a Coder / Python experiment, running the demos/coder/iohub/eyetracking/gcCursor demo will save an hdf5 file using the MouseGaze simulated tracker by default.

Thanks again

1 Like

Thanks again,

I just tried adding the experiment_code argument, which led to HDF5 files being generated, as expected. I also noticed that with the updated/launchHubServer approach, a separate HDF5 file is saved for each participant. That’s really good, because dealing with the single ‘events.hdf5’ file (which could include data from multiple participants) was something of a hassle. The files are still saved in the root project directory by default, but using the datastore_name argument it was fairly easy to reconfigure the experiment so that HDF5 files are now saved to the ‘data’ directory. Unless there’s a particular reason not to, I think it would make sense to make ‘data’ the default save location, and to base file naming on the same pattern PsychoPy Builder uses for other data files (‘participantname_expname_year_month_day_HHMM’).
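For anyone wanting to do the same, a sketch of how the datastore_name redirection described above might look (the ‘participant’ variable and file name pattern are my own illustrative choices, not PsychoPy defaults, and the launchHubServer call is commented out so the snippet stands alone):

```python
import os

# Build a relative path into the Builder-style 'data' folder; iohub will
# append the .hdf5 extension to the datastore name.
participant = 'P01'  # hypothetical participant code
datastore_path = os.path.join('data', f'{participant}_events')

# With PsychoPy available, pass the path via datastore_name:
# from psychopy.iohub import launchHubServer
# io_connection = launchHubServer(
#     experiment_code='infant_audiovis',   # required for hdf5 saving
#     datastore_name=datastore_path,       # e.g. data/P01_events.hdf5
#     iohub_config_name='MOUSE_eyetracker.yaml',
#     window=win,
# )
```

This keeps the per-participant HDF5 files next to the other Builder data files.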

I hear you re: using the built in eye tracking functionality. I was a bit confused by the demos/Builder components, probably because I’d become accustomed to my own way of interacting with iohub, but if I run into any trouble again I’ll try it out.

Could you send the full Builder project or python script you are using?

I can’t share the experiment publicly, because of copyright related to some of the stimuli used. I’ve sent you an invite (user AnonZebra) on GitHub for giving you access to the repo though. (for anyone else reading this: the plan is to, in time, modify the github repo so that it in itself doesn’t include any of the copyrighted images. I’ll try to remember to update this post with a link once this is done)

UPDATE: I’ve finally had the time to create a ‘shareable’ version of the experiment mentioned above, please go to https://github.com/datalowe/psychopy-infant-audiovis-et if you’re interested.

1 Like

I just tried adding the experiment_code argument, which led to HDF5 files being generated

Excellent

The files are still saved in the root project directory by default,…

If using Builder (python generated code), it is saved to the data folder with the rest of the data files and is also given the same name as the other data files (but with .hdf5). I think that makes sense, since Builder experiments always have a data folder, but non Builder experiments may not, or may use a folder by a different name.

If using Builder (python generated code), it is saved to the data folder with the rest of the data files and is also given the same name as the other data files (but with .hdf5). I think that makes sense, since Builder experiments always have a data folder, but non Builder experiments may not, or may use a folder by a different name.

Ah, that explains it. I see what you mean.

I am quite confused about the eye tracker in PsychoPy Builder: the Eyetracking section doesn’t seem to have a place to specify the hardware location. Really looking forward to the updated documentation!

@disadone
I’m not sure what you mean by ‘hardware location’, and I agree that many things are confusing at the moment, but a few things are explained if you try the demo ‘Feature Demos->eyetracking’ (a short README pops up as you open the experiment). As mentioned there, eyetracker specifications are expected to be done in ‘Experiment settings->Eyetracking’. There you can choose type of eyetracker, specify sampling rate, and give some additional information.

@sol So far I have run our experiment and the demos on the experiment computer, and everything works fine now, with ‘MouseGaze’ as well as an actual Tobii eyetracker. This computer runs Windows and is a Lenovo Legion 5 laptop. However, I also use my personal MacBook for development. I just tried running the experiment, as well as the eyetracking Demo experiment mentioned above, on the Mac. Both cause the same type of error; I’m including the output from attempting to run the eyetracking Demo below:

## Running: /Users/workingman/Documents/Programming/python/psychopy/psychopy_demos_v2021_2_3/Feature Demos/eyetracking/eyetracking_lastrun.py ##
355.3329     INFO     Loaded monitor calibration from ['2020_09_29 09:17']
0.9441     WARNING     We strongly recommend you activate the PTB sound engine in PsychoPy prefs as the preferred audio engine. Its timing is vastly superior. Your prefs are currently set to use ['sounddevice', 'PTB', 'pyo', 'pygame'] (in that order).
ioHub Server Process Completed With Code:  -11
2021-09-01 11:26:19.696 python[25799:1241390] ApplePersistenceIgnoreState: Existing state will not be touched. New state will be written to /var/folders/7q/7fxv4sl52jj6v9ghyt1zwvzh0000gq/T/org.opensciencetools.psychopy.savedState
Traceback (most recent call last):
  File "/Users/workingman/Documents/Programming/python/psychopy/psychopy_demos_v2021_2_3/Feature Demos/eyetracking/eyetracking_lastrun.py", line 96, in <module>
    ioServer = io.launchHubServer(window=win, **ioConfig)
  File "/Applications/PsychoPy.app/Contents/Resources/lib/python3.6/psychopy/contrib/lazy_import.py", line 120, in __call__
    return obj(*args, **kwargs)
  File "/Applications/PsychoPy.app/Contents/Resources/lib/python3.6/psychopy/iohub/client/connect.py", line 290, in launchHubServer
    return ioHubConnection(iohub_config)
  File "/Applications/PsychoPy.app/Contents/Resources/lib/python3.6/psychopy/iohub/client/__init__.py", line 289, in __init__
    raise RuntimeError('Error starting ioHub server: {}'.format(self.iohub_status))
RuntimeError: Error starting ioHub server: ioHub startup failed.
##### Experiment ended. #####

I get the same error regardless of whether I use the ‘testMonitor’ monitor specification, or a custom ‘macMonitor’ specification which is tailored to my Mac and otherwise works as expected.

Here are the specs of my Mac:

  • OS: macOS Big Sur version 11.4
  • Model: MacBook Pro (Retina, 13-inch, Early 2015)
  • Processor: 2.7GHz Dual-Core Intel Core i5
  • Memory: 8GB 1867MHz DDR3
  • Graphics: Intel Iris Graphics 6100 1536MB
  • Serial Number: C02PJ38HFVH5
  • Display: Built-in Retina Display

I suspect that the retina display is what’s causing issues. I’ve had trouble with this and PsychoPy in the past. I’ll try testing with an external display, too. Update: Nope, I get the exact same error even if I try running the demo experiment with an external display and Experiment ‘Screen’ settings updated accordingly. Non-ioHub Demo experiments work as expected with either screen. Maybe ioHub is trying to tap into some particular functionality/data and runs into issues with Mac’s aggressive security/privacy/permission policies? I’m not being prompted to allow something extra, but I’ve had problems with this too before. Apple doesn’t make things easy :sweat:

Edit: I should mention that the Mac could run the experiment with PsychoPy v2021.1.x, when I was mocking the eyetracker using the method we discussed [in this thread]. It’s only now that I’m trying to use PsychoPy v2021.2.3 and the more sophisticated built-in mocking that I’m running into issues.

(if you want, I can start a separate thread about this issue, or write it as a GitHub issue instead)

You do need to add the PsychoPy app to the macos security list (I’m not in front of a mac right now so I’m not using the exact terminology) or iohub will not get keyboard or mouse events.

Does the experiment give the exception very quickly after you start it, or more like after 30 seconds or so?

Can you try running a non eye tracking iohub Coder demos on your macOS computer, like psychopy/demos/coder/iohub/keyboard.py, and see if that gives the same error?

Update: On my 2020 macbookpro the Builder and Coder eye tracker demos seem to run fine; tested with PsychoPy 2021.2.3 Standalone for macOS.

@Arboc, can you please confirm that the Feature Demos/eyetracking demo fails for you on macOS even when no changes are made to the demo source? If so, what error are you getting?

Thank you

Thank you, @Arboc! I never knew about Experiment settings->Eyetracking before. The demo doesn’t cover Builder in 2021.2.3, though; I’d really like an example of using Builder.

To get the latest Builder demos, please use the “Demos → Unpack Demos…” menu option from Builder, v2021.2.1 or above, and select where you want the latest demos to be saved. The eye tracking demo should be available in the latest demos.
Thank you

Hi, thanks for having a look at this too,

You do need to add the PsychoPy app to the macos security list (I’m not in front of a mac right now so I’m not using the exact terminology) or iohub will not get keyboard or mouse events.

I checked and PsychoPy already had the following permissions: ‘Accessibility’, ‘Input Monitoring’, ‘Full Disk Access’. I’m guessing this should be all that’s needed.

Does the experiment give the exception very quickly after you start it, or more like after 30 seconds or so?

The latter. During those 30-60 seconds, the experiment is fullscreen and displaying just the grey background, until the error I described occurs.

@Arboc, can you please confirm that the Feature Demos/eyetracking demo fails for you on macOS even when no changes are made to the demo source? If it is, what error are you getting?

I haven’t made any changes to the demo source (ie the .psyexp file), apart from when I tried changing the monitor specification as mentioned in my previous post. I just tried deleting PsychoPy from Applications, downloading and installing PsychoPy 2021.2.3, then running ‘Unpack Demos…’ and finally opening the newly generated .psyexp file. But I still get the same RuntimeError, ioHub startup failed, that I copy-pasted in full in my previous post.

The error means that the iohub process is timing out on startup on your macOS setup. iohub has 30 seconds to fully start up, or the error you are getting will be raised. Not sure why this is happening on your mac machine now, but not others (that we know of).

You mentioned the same experiment used to work on the mac computer you are testing with. What was the last version of psychopy that you had installed on the computer where iohub did start up OK?

Do you have any programs / processes running at the same time as the experiment which take up a lot of cpu?

Does this happen just with the eye tracking demo in Builder, or does it also happen if you try and run psychopy/demos/coder/iohub/keyboard.py from Coder or a terminal window?

thank you

You mentioned the same experiment use to work on the mac computer you are testing with. What was the last version of psychopy that you had installed on the computer where iohub did start up OK?

Not sure about the exact version, but it was 2021.1.x.

Do you have any programs / processes running at the same time as the experiment which take up a lot of cpu?

Nope, and I’ve tried with just PsychoPy (apart from regular background processes of course) running, and it still throws the same error.

Does this happen just with the eye tracking demo in Builder, or does it also happen if you try and run psychopy/demos/coder/iohub/keyboard.py from Coder or a terminal window?

You’re right, it appears that the issue isn’t just with eyetracking… I got the error below when trying to run the ‘feature demos->iohub->stroop_keyboard->stroop.py’ experiment (I don’t think the one you mentioned is bundled with the standard standalone demos?)

## Running: /Users/lowe/Desktop/psyp_temp/Feature Demos/iohub/stroop_keyboard/stroop.py ##
!! Error starting ioHub:  Error starting ioHub server: ioHub startup failed.  Exiting...
2.1116     WARNING     We strongly recommend you activate the PTB sound engine in PsychoPy prefs as the preferred audio engine. Its timing is vastly superior. Your prefs are currently set to use ['sounddevice', 'pyo', 'pygame'] (in that order).
ioHub Server Process Completed With Code:  -11
2021-09-15 15:12:50.627 python[46203:3432196] ApplePersistenceIgnoreState: Existing state will not be touched. New state will be written to /var/folders/t6/mpg802p51ls2467jcqw5mlgc0000gn/T/org.opensciencetools.psychopy.savedState
##### Experiment ended. #####

Strange. When I have time, I’ll try downgrading to an earlier PsychoPy version and see if that causes the same issue, then update this post.

Maybe start a new thread about the macOS startup issue if you have any more info. Thanks again

Hi, I have the same problem on my Mac Book Pro. Is there any latest progress on this issue?