Hi, I’m new to this forum (which I really like!), so nice to meet you all. I’m trying to make my Neon eye-tracker from Pupil Labs communicate with PsychoPy, so far with little success. I created a simple emotion recognition task in PsychoPy, and it works; the problems start when I try to integrate the eye-tracker. I give all the information below. I checked older posts on the forum but could not find anything similar (or maybe I missed it; in that case, I’m sorry).
OS: Windows 11
PsychoPy version: 2025.1.1 Py 3.10
Standard Standalone Installation? (y/n): Yes
Do you want it to also run online? (y/n): No
What are you trying to achieve?: Starting the eye-tracker recording using the EyeTracking Record component in PsychoPy, and getting eye-tracker samples written to the HDF5 file (the file is created, but the eyetracker tables are empty).
What did you try to make it work?: I installed the Pupil Labs plugin for PsychoPy, psychopy-eyetracker-pupil-labs 0.8.1. Other related installed packages are: pupil_apriltags 1.0.4.post11, pupil-labs-neon-recording 2.1.0, pupil-labs-realtime-api 1.8.0, and pupil-labs-video 1.0.9. The experiment is presented on an external monitor, an ASUS VW22ATL (1680 x 1050, 59.95 Hz). The PC and the Neon Companion phone are connected through a phone hotspot.
The main problem is that Neon seems to work correctly on the Companion side and via LSL, but PsychoPy does not receive/write eyetracker samples. The HDF5 file is created, but:
- data_collection/events/eyetracker/MonocularEyeSampleEvent is empty
- data_collection/events/eyetracker/BinocularEyeSampleEvent is also empty
When I tried to start the eye-tracker recording via the etRecord component in PsychoPy, I noticed that using the Companion address neon.local did not work. Using the direct Companion IP address does connect the eye-tracker, but even though the recording then starts, there is a delay of roughly one minute between the beginning of the experiment and the beginning of the eye-tracker recording.
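My working hypothesis (just an assumption on my part) is that neon.local is an mDNS name, and mDNS resolution is often blocked on phone hotspots, which would explain why only the raw IP works. A minimal stdlib sketch of that fallback logic (the IP 192.0.2.10 is only a placeholder for whatever address the Companion app shows):

```python
import socket

def resolve_companion(hostname: str, fallback_ip: str) -> str:
    """Resolve the Companion's .local name, falling back to a fixed IP.

    .local names are resolved via mDNS, which many phone hotspots block,
    so on such networks only a hard-coded IP address will work.
    """
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        # mDNS/DNS resolution failed on this network: use the known IP.
        return fallback_ip

# 192.0.2.10 is a placeholder -- use the IP shown in the Companion app.
print(resolve_companion("neon.local", "192.0.2.10"))
```

If this is the cause, it would also make sense to enter the IP address directly in the eyetracker settings instead of neon.local.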
I also put the AprilTag Frame in all the routines of my experiment, and I defined the ROIs in my stimuli. However, as I mentioned before, when I look at the experiment outputs (HDF5, CSV), there is no data about eye-tracking, ROIs, or tags.
Therefore, I ran some tests to narrow down where the problem comes from. I describe them below.
1. Basic PsychoPy HDF5 test
I created a minimal PsychoPy experiment with:
- Eyetracker Record
- AprilTags
- Fixation cross
- Save HDF5 enabled
The result is that the HDF5 file is created, but the eyetracker section remains empty.
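For anyone who wants to reproduce the check, this is roughly how I confirm the tables are empty (h5py is an extra package, not part of my PsychoPy setup; "demo.hdf5" here just builds a stand-in file reproducing the symptom, and should be replaced with the file PsychoPy actually saved):

```python
import h5py
import numpy as np

def count_event_rows(hdf5_path: str) -> dict:
    """Return {table path: row count} for every dataset under
    data_collection/events in an ioHub-style HDF5 file."""
    counts = {}
    def visit(name, obj):
        if isinstance(obj, h5py.Dataset):
            counts[name] = obj.shape[0]
    with h5py.File(hdf5_path, "r") as f:
        f["data_collection/events"].visititems(visit)
    return counts

# Build a stand-in file reproducing the symptom: an eyetracker table
# that exists but holds zero rows.
with h5py.File("demo.hdf5", "w") as f:
    f.create_dataset(
        "data_collection/events/eyetracker/MonocularEyeSampleEvent",
        data=np.zeros((0,)),
    )

print(count_event_rows("demo.hdf5"))
# -> {'eyetracker/MonocularEyeSampleEvent': 0}
```

On my real files, the eyetracker tables show 0 rows while the other event tables are populated.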
2. MouseGaze control test
I repeated the same minimal experiment using MouseGaze instead of Neon. The result is that the MonocularEyeSampleEvent is correctly populated.
3. Official Pupil Labs PsychoPy demo
I downloaded and ran the official gaze_contingent_demo.psyexp, and I observed that the demo does not progress when I look at the required area, as if no usable gaze data were reaching PsychoPy.
4. Companion app checks
In the Companion app, the scene video is visible, the gaze overlay is visible in Preview, gaze mode is set to Binocular, and the eye cameras are working. I made a short recording (~18 s) and uploaded it to Cloud, where gaze and fixations are visible. So the tracker hardware and the onboard gaze estimation appear to work correctly.
5. LSL checks
I enabled Stream over LSL in the Companion app. In LabRecorder, I can see both Neon Companion_Neon Gaze and Neon Companion_Neon Events. I also tested with Python (pylsl) and can receive live samples from Neon Companion_Neon Gaze (timestamps and changing x, y values arrive continuously). So the live gaze stream reaches the PC successfully.
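For completeness, my pylsl check looked roughly like this (the stream name is exactly as it appears in LabRecorder; the import guard is only there so the sketch degrades cleanly if liblsl is missing):

```python
def format_gaze(timestamp: float, sample) -> str:
    """Render one LSL gaze sample (x, y values) for logging."""
    return f"{timestamp:.3f}  x={sample[0]:.1f}  y={sample[1]:.1f}"

try:
    # pylsl needs the liblsl binary; guarded so the sketch exits cleanly.
    from pylsl import StreamInlet, resolve_byprop
except (ImportError, OSError, RuntimeError):
    resolve_byprop = None

if resolve_byprop is not None:
    # Stream name exactly as shown in LabRecorder.
    streams = resolve_byprop("name", "Neon Companion_Neon Gaze", timeout=5.0)
    if streams:
        inlet = StreamInlet(streams[0])
        for _ in range(10):
            sample, ts = inlet.pull_sample(timeout=2.0)
            if sample is not None:
                print(format_gaze(ts, sample))
    else:
        print("No gaze stream found -- check 'Stream over LSL' in the app.")
```

When the Companion is streaming on the same network, this prints a steady flow of timestamps with changing x, y values, which is what I observed.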
Based on all these checks, it seems that Neon hardware is working, Companion preview is working, Cloud recordings are working, LSL streaming to the PC is working, and PsychoPy HDF5 writing works in principle (because MouseGaze works). So, in my opinion, the remaining issue seems to be specifically in:
- the PsychoPy Neon plugin integration, and/or
- the conversion from Neon live gaze to PsychoPy screen-gaze / ioHub events on my setup
I don’t know whether this can be solved with some code or in another way. Has anyone run into the same problem and been able to solve it? Any help would be appreciated. Thank you.
Beatrice