DeToX: a package for infant-friendly Tobii eye-tracking

Hi all,
(hopefully this is the right category)

I’m a postdoc in developmental science and have been using PsychoPy to run my studies for several years. I’ve conducted multiple eye-tracking studies with Tobii hardware, and I’ve consistently found the integration between Tobii eye trackers and PsychoPy lacking—particularly for features tailored to infants and children.

As mentioned in a previous discussion, there’s a real need for these features in our field. Over the years, my colleagues (@FrancescPoli) and I have relied on psychopy_tobii_infant, which provided helpful infant-friendly calibration and setup tools.

As we gained a better understanding of how to integrate PsychoPy and the Tobii Research SDK, I decided to try to extend this work and develop a more comprehensive package. Here’s our initial attempt: DeToX.

The core philosophy behind DeToX is to be:

  • Simple: No unnecessary complexity
  • PsychoPy-native: All dependencies are already part of PsychoPy and Tobii Research SDK
  • Well-documented: This is more of a promise for now. I plan to create comprehensive guides, both on a dedicated DeToX site and integrated with DevStart, an open science site for developmental research that my colleagues are working on.

I’m sharing this here before it’s fully ready (there are definitely bugs and things to improve and fix) in case anyone would like to check it out, contribute, or point out any major issues. This is my first attempt at developing a Python package, and I’ve been working on it in my free time alongside my research, so progress has been gradual. I’ve learned a lot through trial and error (with some AI assistance along the way).

Features I tried to implement

Calibration

  • Track box visualization: Display the Tobii track box to help experimenters center participants within the optimal tracking area
  • Experimenter-paced calibration: Full experimenter control over calibration stimuli presentation
    • Customizable animations: trill or zoom effects
  • Visual calibration results: Display calibration accuracy with dots or connecting lines showing gaze offset at each point
  • Flexible stimuli: Bundled stimuli (coming in the next release) for quick setup, but users can supply their own custom calibration targets
  • Save/load calibration results: Calibration data can be saved to or loaded from file (with or without GUI)

Data Management (HDF5)

  • Dual data format options:
    • Raw format: Preserves data exactly as provided by Tobii Research SDK
    • Simplified format: Curated selection of the most important columns with coordinates transformed to match PsychoPy’s coordinate system
  • Event handling:
    • Events are time-aligned and saved as an additional column in the main data stream
    • Events are also stored separately in a dedicated HDF5 table for flexible analysis workflows
  • Session metadata: Recording parameters (sampling rate, illumination mode, etc.) are automatically stored as HDF5 metadata
  • Flexible saving options:
    • Save all data at session end
    • Manually save at multiple timepoints—new data automatically appends to the same file
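The coordinate transform behind the simplified format can be illustrated in a few lines. Tobii reports gaze in its Active Display Coordinate System (normalized 0–1, origin top-left, y increasing downward), while PsychoPy's 'height' units put the origin at screen centre with y increasing upward and the screen height equal to 1. A sketch of that conversion, assuming 'height' units (DeToX's actual implementation may differ):

```python
def adcs_to_height_units(x, y, screen_w_px, screen_h_px):
    """Convert a Tobii ADCS gaze point (0-1, origin top-left, y down)
    to PsychoPy 'height' units (origin centre, y up, screen height = 1)."""
    aspect = screen_w_px / screen_h_px
    x_h = (x - 0.5) * aspect  # re-centre, scale x by the aspect ratio
    y_h = 0.5 - y             # re-centre and flip the y axis
    return x_h, y_h
```

For example, the screen centre (0.5, 0.5) maps to (0, 0), and the top-left corner maps to (-aspect/2, 0.5).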

Simulation Mode

  • Mouse-based testing: Use mouse position to simulate gaze data for testing and debugging—works with most DeToX functions, allowing you to develop and test experiments without needing eye-tracking hardware
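A common way to implement this kind of simulation mode is to hide the data source behind one small interface, so the rest of the experiment code never cares whether samples come from the tracker or the mouse. A sketch of the pattern (names are hypothetical; in a real PsychoPy script the position callable would be something like `mouse.getPos`):

```python
import time


class SimulatedGazeSource:
    """Wrap any (x, y) position callable -- e.g. a PsychoPy Mouse's
    getPos -- so it yields gaze-like samples without a tracker."""

    def __init__(self, get_pos):
        self._get_pos = get_pos  # callable returning the current (x, y)

    def sample(self):
        x, y = self._get_pos()
        # Mimic a binocular tracker sample: same point for both eyes.
        return {"time": time.monotonic(), "left": (x, y), "right": (x, y)}
```

Swapping this object for a real tracker wrapper is then a one-line change in the experiment script.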

Gaze-Contingent Functionality

  • Real-time gaze processing: Activate gaze-contingent features to extract median, mean, or most recent sample from a rolling window of configurable size
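A rolling-window reducer like this is straightforward to sketch with a bounded deque (purely illustrative; the class and method names are not DeToX's actual API):

```python
from collections import deque
from statistics import mean, median


class RollingGaze:
    """Keep the last `size` gaze values and reduce them on demand."""

    def __init__(self, size=6):
        self._buf = deque(maxlen=size)  # old samples drop off automatically

    def push(self, value):
        self._buf.append(value)

    def get(self, method="median"):
        if not self._buf:
            return None  # no samples yet
        if method == "median":
            return median(self._buf)  # robust to blinks/outliers
        if method == "mean":
            return mean(self._buf)
        return self._buf[-1]  # "recent": the newest sample
```

The median is usually the safest default with infants, since single dropped or noisy samples around blinks barely move it.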

Settings & Customization

  • Configuration options: Customize colors, sizes, timing parameters, and many other aspects of calibration and display

This sounds like a cool and useful extension. Let us know if you need help with anything (e.g. making Builder Components to help less-technical users?)

Best wishes,
Jon


I’m not sure if I’m following any PsychoPy best practices at the moment.
Right now I’m focusing on making sure it works properly (again, I’m definitely no code expert). Once it’s reliable, I’d love to improve how it interacts with PsychoPy, for example around Builder components.

Yes, agreed. To be honest that’s how we operate too - we start with the code in a script, then in a Code Component, and as that solidifies and becomes boilerplate we move it to a Component for easier re-use.


Thank you for sharing this - we have spoken with so many infant researchers using PsychoPy who would benefit from a package like this. I am putting it on my to-do list to test out what you have so far when I next have the Tobii in my hands!

As Jon said - let us know if we can help in any way - I’ll report back here once I have delved into the repository you linked more!

Becca

Yes, that’s exactly why I started coding it. We’ve given workshops on using tobii_research for eye-tracking, but a simpler package would definitely be welcome.

Please give it a try! I’m working on it in my free time alongside the documentation, so progress is slow. The new beta now includes stimuli, making it easier to run a basic calibration. You can also test it using the mouse instead of the Tobii, though obviously the real challenge is getting the Tobii interaction right.

If you get a chance to look at it, let me know if anything’s confusing. I’m keeping things as simple as possible since this is my first Python package.